Mar 21 04:51:33 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 21 04:51:33 crc restorecon[4579]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 04:51:33 crc restorecon[4579]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc 
restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc 
restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 
04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc 
restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:51:33 crc 
restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:33
crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:33 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 
04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:51:34 crc 
restorecon[4579]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc 
restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc 
restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:51:34 crc restorecon[4579]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc 
restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc 
restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc 
restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:51:34 crc restorecon[4579]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:51:34 crc restorecon[4579]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 21 04:51:35 crc kubenswrapper[4580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 04:51:35 crc kubenswrapper[4580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 21 04:51:35 crc kubenswrapper[4580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 04:51:35 crc kubenswrapper[4580]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 21 04:51:35 crc kubenswrapper[4580]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 21 04:51:35 crc kubenswrapper[4580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.356966 4580 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360267 4580 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360292 4580 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360298 4580 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360303 4580 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360310 4580 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360316 4580 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360323 4580 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360330 4580 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360337 4580 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360344 4580 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360351 4580 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360356 4580 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360361 4580 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360366 4580 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360372 4580 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360376 4580 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360381 4580 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360384 4580 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360388 4580 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360392 4580 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360395 4580 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360407 4580 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallGCP Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360412 4580 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360416 4580 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360420 4580 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360425 4580 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360430 4580 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360437 4580 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360444 4580 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360449 4580 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360454 4580 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360460 4580 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360465 4580 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360471 4580 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360476 4580 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360483 4580 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAzure Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360488 4580 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360496 4580 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360500 4580 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360505 4580 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360510 4580 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360515 4580 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360521 4580 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360526 4580 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360531 4580 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360535 4580 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360540 4580 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360544 4580 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360549 4580 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360553 4580 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 21 
04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360557 4580 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360560 4580 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360564 4580 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360567 4580 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360571 4580 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360574 4580 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360579 4580 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360582 4580 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360586 4580 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360589 4580 feature_gate.go:330] unrecognized feature gate: Example Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360592 4580 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360596 4580 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360599 4580 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360603 4580 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360606 4580 
feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360611 4580 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360617 4580 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360623 4580 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360627 4580 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360631 4580 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.360635 4580 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361633 4580 flags.go:64] FLAG: --address="0.0.0.0" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361648 4580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361655 4580 flags.go:64] FLAG: --anonymous-auth="true" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361664 4580 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361670 4580 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361675 4580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361683 4580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361689 4580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 21 04:51:35 crc 
kubenswrapper[4580]: I0321 04:51:35.361693 4580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361698 4580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361702 4580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361707 4580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361712 4580 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361716 4580 flags.go:64] FLAG: --cgroup-root="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361721 4580 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361725 4580 flags.go:64] FLAG: --client-ca-file="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361729 4580 flags.go:64] FLAG: --cloud-config="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361733 4580 flags.go:64] FLAG: --cloud-provider="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361737 4580 flags.go:64] FLAG: --cluster-dns="[]" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361744 4580 flags.go:64] FLAG: --cluster-domain="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361748 4580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361752 4580 flags.go:64] FLAG: --config-dir="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361757 4580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361761 4580 flags.go:64] FLAG: --container-log-max-files="5" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361767 4580 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 21 
04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361771 4580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361788 4580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361793 4580 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361797 4580 flags.go:64] FLAG: --contention-profiling="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361802 4580 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361806 4580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361810 4580 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361814 4580 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361820 4580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361824 4580 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361829 4580 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361833 4580 flags.go:64] FLAG: --enable-load-reader="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361839 4580 flags.go:64] FLAG: --enable-server="true" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361843 4580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361848 4580 flags.go:64] FLAG: --event-burst="100" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361852 4580 flags.go:64] FLAG: --event-qps="50" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361856 4580 flags.go:64] FLAG: 
--event-storage-age-limit="default=0" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361860 4580 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361864 4580 flags.go:64] FLAG: --eviction-hard="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361872 4580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361876 4580 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361880 4580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361884 4580 flags.go:64] FLAG: --eviction-soft="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361888 4580 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361893 4580 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361897 4580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361901 4580 flags.go:64] FLAG: --experimental-mounter-path="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361905 4580 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361909 4580 flags.go:64] FLAG: --fail-swap-on="true" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361914 4580 flags.go:64] FLAG: --feature-gates="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361919 4580 flags.go:64] FLAG: --file-check-frequency="20s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361923 4580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361928 4580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361933 
4580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361937 4580 flags.go:64] FLAG: --healthz-port="10248" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361941 4580 flags.go:64] FLAG: --help="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361946 4580 flags.go:64] FLAG: --hostname-override="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361951 4580 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361955 4580 flags.go:64] FLAG: --http-check-frequency="20s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361960 4580 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361964 4580 flags.go:64] FLAG: --image-credential-provider-config="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361968 4580 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361973 4580 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361977 4580 flags.go:64] FLAG: --image-service-endpoint="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361982 4580 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361986 4580 flags.go:64] FLAG: --kube-api-burst="100" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361991 4580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.361995 4580 flags.go:64] FLAG: --kube-api-qps="50" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362000 4580 flags.go:64] FLAG: --kube-reserved="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362005 4580 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362010 4580 
flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362016 4580 flags.go:64] FLAG: --kubelet-cgroups="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362021 4580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362026 4580 flags.go:64] FLAG: --lock-file="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362032 4580 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362037 4580 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362042 4580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362050 4580 flags.go:64] FLAG: --log-json-split-stream="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362055 4580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362060 4580 flags.go:64] FLAG: --log-text-split-stream="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362065 4580 flags.go:64] FLAG: --logging-format="text" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362069 4580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362074 4580 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362080 4580 flags.go:64] FLAG: --manifest-url="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362085 4580 flags.go:64] FLAG: --manifest-url-header="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362091 4580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362096 4580 flags.go:64] FLAG: --max-open-files="1000000" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 
04:51:35.362102 4580 flags.go:64] FLAG: --max-pods="110" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362107 4580 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362111 4580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362116 4580 flags.go:64] FLAG: --memory-manager-policy="None" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362120 4580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362125 4580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362129 4580 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362133 4580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362144 4580 flags.go:64] FLAG: --node-status-max-images="50" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362149 4580 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362153 4580 flags.go:64] FLAG: --oom-score-adj="-999" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362157 4580 flags.go:64] FLAG: --pod-cidr="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362161 4580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362169 4580 flags.go:64] FLAG: --pod-manifest-path="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362174 4580 flags.go:64] FLAG: --pod-max-pids="-1" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362178 4580 flags.go:64] FLAG: 
--pods-per-core="0" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362183 4580 flags.go:64] FLAG: --port="10250" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362188 4580 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362192 4580 flags.go:64] FLAG: --provider-id="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362197 4580 flags.go:64] FLAG: --qos-reserved="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362201 4580 flags.go:64] FLAG: --read-only-port="10255" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362205 4580 flags.go:64] FLAG: --register-node="true" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362210 4580 flags.go:64] FLAG: --register-schedulable="true" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362214 4580 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362222 4580 flags.go:64] FLAG: --registry-burst="10" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362226 4580 flags.go:64] FLAG: --registry-qps="5" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362230 4580 flags.go:64] FLAG: --reserved-cpus="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362234 4580 flags.go:64] FLAG: --reserved-memory="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362240 4580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362244 4580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362249 4580 flags.go:64] FLAG: --rotate-certificates="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362253 4580 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362257 4580 flags.go:64] FLAG: --runonce="false" Mar 21 04:51:35 crc kubenswrapper[4580]: 
I0321 04:51:35.362262 4580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362266 4580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362270 4580 flags.go:64] FLAG: --seccomp-default="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362274 4580 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362278 4580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362283 4580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362287 4580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362291 4580 flags.go:64] FLAG: --storage-driver-password="root" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362295 4580 flags.go:64] FLAG: --storage-driver-secure="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362299 4580 flags.go:64] FLAG: --storage-driver-table="stats" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362303 4580 flags.go:64] FLAG: --storage-driver-user="root" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362308 4580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362312 4580 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362316 4580 flags.go:64] FLAG: --system-cgroups="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362320 4580 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362327 4580 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362332 4580 flags.go:64] FLAG: 
--tls-cert-file="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362336 4580 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362341 4580 flags.go:64] FLAG: --tls-min-version="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362345 4580 flags.go:64] FLAG: --tls-private-key-file="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362350 4580 flags.go:64] FLAG: --topology-manager-policy="none" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362355 4580 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362359 4580 flags.go:64] FLAG: --topology-manager-scope="container" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362363 4580 flags.go:64] FLAG: --v="2" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362369 4580 flags.go:64] FLAG: --version="false" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362375 4580 flags.go:64] FLAG: --vmodule="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362381 4580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362386 4580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362499 4580 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362505 4580 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362509 4580 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362513 4580 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362518 4580 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 21 04:51:35 crc 
kubenswrapper[4580]: W0321 04:51:35.362522 4580 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362526 4580 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362529 4580 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362533 4580 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362538 4580 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362543 4580 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362547 4580 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362551 4580 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362555 4580 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362559 4580 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362564 4580 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362567 4580 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362572 4580 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362576 4580 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 
04:51:35.362580 4580 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362583 4580 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362587 4580 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362590 4580 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362595 4580 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362599 4580 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362604 4580 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362607 4580 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362611 4580 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362616 4580 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362620 4580 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362623 4580 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362627 4580 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362631 4580 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362634 4580 
feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362638 4580 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362642 4580 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362645 4580 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362649 4580 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362652 4580 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362656 4580 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362659 4580 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362663 4580 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362666 4580 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362670 4580 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362673 4580 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362676 4580 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362681 4580 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362686 4580 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362690 4580 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362700 4580 feature_gate.go:330] unrecognized feature gate: Example
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362704 4580 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362708 4580 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362712 4580 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362715 4580 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362719 4580 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362722 4580 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362726 4580 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362730 4580 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362734 4580 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362737 4580 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362741 4580 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362744 4580 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362748 4580 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362751 4580 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362759 4580 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362762 4580 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362766 4580 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362770 4580 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362773 4580 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362789 4580 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.362792 4580 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.362805 4580 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.374497 4580 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.374542 4580 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374625 4580 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374634 4580 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374640 4580 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374647 4580 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374653 4580 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374659 4580 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374663 4580 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374668 4580 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374673 4580 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374678 4580 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374683 4580 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374690 4580 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374696 4580 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374702 4580 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374708 4580 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374713 4580 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374718 4580 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374723 4580 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374728 4580 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374734 4580 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374739 4580 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374743 4580 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374748 4580 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374753 4580 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374759 4580 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374768 4580 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374774 4580 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374799 4580 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374804 4580 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374809 4580 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374815 4580 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374823 4580 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374829 4580 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374834 4580 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374840 4580 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374846 4580 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374851 4580 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374856 4580 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374861 4580 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374866 4580 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374871 4580 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374876 4580 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374881 4580 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374886 4580 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374891 4580 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374896 4580 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374901 4580 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374906 4580 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374911 4580 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374916 4580 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374921 4580 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374926 4580 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374931 4580 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374936 4580 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374942 4580 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374946 4580 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374951 4580 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374958 4580 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374964 4580 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374970 4580 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374976 4580 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374982 4580 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374986 4580 feature_gate.go:330] unrecognized feature gate: Example
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374991 4580 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.374996 4580 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375003 4580 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375009 4580 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375015 4580 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375021 4580 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375026 4580 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375032 4580 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.375041 4580 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375190 4580 feature_gate.go:330] unrecognized feature gate: Example
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375200 4580 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375206 4580 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375211 4580 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375217 4580 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375222 4580 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375227 4580 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375232 4580 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375238 4580 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375248 4580 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375254 4580 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375259 4580 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375264 4580 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375270 4580 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375277 4580 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375283 4580 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375288 4580 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375294 4580 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375299 4580 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375304 4580 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375309 4580 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375314 4580 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375319 4580 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375324 4580 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375329 4580 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375334 4580 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375339 4580 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375344 4580 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375349 4580 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375355 4580 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375364 4580 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375371 4580 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375377 4580 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375384 4580 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375391 4580 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375397 4580 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375402 4580 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375407 4580 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375412 4580 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375417 4580 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375423 4580 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375429 4580 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375435 4580 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375440 4580 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375445 4580 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375451 4580 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375456 4580 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375462 4580 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375468 4580 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375475 4580 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375481 4580 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375486 4580 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375492 4580 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375498 4580 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375504 4580 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375509 4580 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375514 4580 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375519 4580 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375524 4580 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375551 4580 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375557 4580 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375563 4580 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375569 4580 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375574 4580 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375579 4580 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375584 4580 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375589 4580 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375594 4580 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375599 4580 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375606 4580 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.375612 4580 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.375619 4580 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.376601 4580 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.380619 4580 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.386023 4580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.386164 4580 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.387890 4580 server.go:997] "Starting client certificate rotation"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.387931 4580 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.388128 4580 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.415663 4580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.417037 4580 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.422295 4580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.438105 4580 log.go:25] "Validated CRI v1 runtime API"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.479760 4580 log.go:25] "Validated CRI v1 image API"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.482086 4580 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.486811 4580 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-21-04-46-02-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.486849 4580 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.503186 4580 manager.go:217] Machine: {Timestamp:2026-03-21 04:51:35.500486438 +0000 UTC m=+0.583070106 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2800000 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:30104a4f-3cbf-4278-a817-16cb78d9b6b0 BootID:9b0468f0-788e-4966-a835-4b5e60e90122 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:88:52:78 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:88:52:78 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:38:20:e7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:11:63:05 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ef:41:44 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f6:5d:b6 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:82:de:f7:43:e1:f2 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:5a:1c:a7:97:4c:8c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.503470 4580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.503680 4580 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.504434 4580 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.504833 4580 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.504916 4580 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.505298 4580 topology_manager.go:138] "Creating topology manager with none policy"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.505317 4580 container_manager_linux.go:303] "Creating device plugin manager"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.506231 4580 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.506295 4580 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.506677 4580 state_mem.go:36] "Initialized new in-memory state store"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.506861 4580 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.510380 4580 kubelet.go:418] "Attempting to sync node with API server"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.510452 4580 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.510549 4580 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.510587 4580 kubelet.go:324] "Adding apiserver pod source"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.510609 4580 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.516816 4580 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.517614 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.517978 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.518026 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.518196 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.518301 4580 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.521154 4580 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.522613 4580 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.522646 4580 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.522657 4580 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.522667 4580 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.522684 4580 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.522694 4580 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.522704 4580 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.522719 4580 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.522731 4580 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.522743 4580 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.522755 4580 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.522764 4580 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.524185 4580 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.524937 4580 server.go:1280] "Started kubelet"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.525363 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:35 crc systemd[1]: Started Kubernetes Kubelet.
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.528852 4580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.529233 4580 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.536664 4580 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.538313 4580 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.538436 4580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.539615 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.540070 4580 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.540092 4580 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.540828 4580 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.543457 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="200ms"
Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.551011 4580 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ec21443e6ebf8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.524895736 +0000 UTC m=+0.607479374,LastTimestamp:2026-03-21 04:51:35.524895736 +0000 UTC m=+0.607479374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.553070 4580 factory.go:55] Registering systemd factory
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.553156 4580 factory.go:221] Registration of the systemd container factory successfully
Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.554436 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.554537 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.556333 4580 factory.go:153] Registering CRI-O factory
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.556366 4580 factory.go:221] Registration of the crio container factory successfully
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.556462 4580 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.556511 4580 factory.go:103] Registering Raw factory
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.556534 4580 manager.go:1196] Started watching for new ooms in manager
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.558914 4580 server.go:460] "Adding debug handlers to kubelet server"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.559125 4580 manager.go:319] Starting recovery of all containers
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.559111 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561113 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561177 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561203 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561225 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561247 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561271 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561295 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561348 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561372 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561403 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561425 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561447 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561473 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561494 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561548 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561571 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561593 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561615 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561636 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561658 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561681 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561706 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561730 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561752 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561775 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561882 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561909 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561932 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561955 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561975 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.561995 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562015 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562064 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562087 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562108 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562128 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562149 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562174 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562198 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562220 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562243 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562266 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562289 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562310 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562333 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562354 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562379 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562398 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562421 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562445 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562467 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562496 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562520 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562547 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562570 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562601 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562622 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562645 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562669 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562693 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562714 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562735 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562757 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562812 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562833 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562855 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562876 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562900 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562921 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562942 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562967 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.562988 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563012 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563033 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563055 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563078 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563100 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563121 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563145 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563167 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563190 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563221 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563244 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563267 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563288 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563310 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563334 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563355 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254"
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563379 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563403 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563427 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563447 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563467 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563487 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563507 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563528 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563548 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563569 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563589 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563610 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563630 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563650 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563673 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563701 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563723 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563741 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" 
seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563758 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563774 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563812 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563829 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563854 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563876 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: 
I0321 04:51:35.563899 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563921 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563938 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563953 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563969 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.563991 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564018 4580 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564039 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564059 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564079 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564099 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564121 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564142 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564161 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564182 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564204 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564226 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564249 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564271 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564294 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564316 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564340 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564364 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564385 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564406 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" 
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564426 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564446 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564469 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564489 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564511 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564532 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564553 
4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564572 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564593 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564637 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564660 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564682 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564703 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.564765 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.567844 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.567889 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.571982 4580 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572033 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572057 4580 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572080 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572101 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572123 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572146 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572225 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572244 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572265 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572285 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572303 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572345 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572373 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572392 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572411 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572445 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572466 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572486 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572504 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572528 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572550 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572569 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572586 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572607 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572626 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572643 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572660 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572677 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572700 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572720 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572738 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572755 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" 
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572773 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572858 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572886 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572907 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572928 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572948 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572971 4580 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.572992 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573012 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573032 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573051 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573072 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573096 4580 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573118 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573137 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573158 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573178 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573198 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573217 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573243 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573263 4580 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573279 4580 reconstruct.go:97] "Volume reconstruction finished" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573292 4580 reconciler.go:26] "Reconciler: start to sync state" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.573680 4580 manager.go:324] Recovery completed Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.588864 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.591550 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.591596 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.591611 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.592526 4580 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 
04:51:35.592551 4580 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.592681 4580 state_mem.go:36] "Initialized new in-memory state store" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.613803 4580 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.616446 4580 policy_none.go:49] "None policy: Start" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.616481 4580 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.616530 4580 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.616557 4580 kubelet.go:2335] "Starting kubelet main sync loop" Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.616608 4580 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.618624 4580 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.618741 4580 state_mem.go:35] "Initializing new in-memory state store" Mar 21 04:51:35 crc kubenswrapper[4580]: W0321 04:51:35.619015 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.619122 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 
38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.640101 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.668145 4580 manager.go:334] "Starting Device Plugin manager" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.668334 4580 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.668350 4580 server.go:79] "Starting device plugin registration server" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.668902 4580 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.668923 4580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.669268 4580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.669368 4580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.669385 4580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.678197 4580 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.717644 4580 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 21 
04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.717821 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.719346 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.719390 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.719407 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.719632 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.720166 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.720204 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.720767 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.720805 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.720813 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.721405 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.721448 4580 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.721458 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.721810 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.721966 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.722037 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.723141 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.723161 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.723173 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.723146 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.723222 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.723234 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.723330 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.723398 4580 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.723438 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.724118 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.724148 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.724160 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.724339 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.724356 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.724366 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.724438 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.724555 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.724602 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.725519 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.725541 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.725550 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.725612 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.725650 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.725666 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.726009 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.726084 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.727258 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.727298 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.727310 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.747886 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="400ms" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.769935 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.771253 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.771290 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.771301 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.771327 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.771942 4580 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775131 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775176 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775206 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775230 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775318 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775424 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775486 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775521 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775562 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775591 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775619 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775654 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775691 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775734 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.775769 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876611 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876664 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876688 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876706 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876724 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876740 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876755 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876770 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876815 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876844 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876866 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876888 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876908 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876907 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876965 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876912 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.876979 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.877061 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.877081 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.877041 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.877101 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.877089 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.877209 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.877084 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.877243 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.877017 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.877272 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.877271 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.877290 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.877297 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.972149 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.974100 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.974174 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.974188 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:51:35 crc kubenswrapper[4580]: I0321 04:51:35.974215 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:51:35 crc kubenswrapper[4580]: E0321 04:51:35.974683 4580 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc"
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.055220 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.063336 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.080817 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.099477 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.105550 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 21 04:51:36 crc kubenswrapper[4580]: W0321 04:51:36.143244 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-656da4d1dc7c5b9a0380abc1583a18fd1a504bfa40d67b2e5aef16d593a287e5 WatchSource:0}: Error finding container 656da4d1dc7c5b9a0380abc1583a18fd1a504bfa40d67b2e5aef16d593a287e5: Status 404 returned error can't find the container with id 656da4d1dc7c5b9a0380abc1583a18fd1a504bfa40d67b2e5aef16d593a287e5
Mar 21 04:51:36 crc kubenswrapper[4580]: W0321 04:51:36.146162 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-5443b0b5be89e879e2bb84b10ff97cf6d0fd58d4b0e55763e5623eccee8acd95 WatchSource:0}: Error finding container 5443b0b5be89e879e2bb84b10ff97cf6d0fd58d4b0e55763e5623eccee8acd95: Status 404 returned error can't find the container with id 5443b0b5be89e879e2bb84b10ff97cf6d0fd58d4b0e55763e5623eccee8acd95
Mar 21 04:51:36 crc kubenswrapper[4580]: W0321 04:51:36.148440 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-4503f37e56fadac8ecc29ab5878abc949b12443ca4be804e077243a441ee0bad WatchSource:0}: Error finding container 4503f37e56fadac8ecc29ab5878abc949b12443ca4be804e077243a441ee0bad: Status 404 returned error can't find the container with id 4503f37e56fadac8ecc29ab5878abc949b12443ca4be804e077243a441ee0bad
Mar 21 04:51:36 crc kubenswrapper[4580]: E0321 04:51:36.148584 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="800ms"
Mar 21 04:51:36 crc kubenswrapper[4580]: W0321 04:51:36.165569 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-55dc2cb8c41f5582a75fcad4d80354349dd0df3dbb4845653e53a1b1f425473c WatchSource:0}: Error finding container 55dc2cb8c41f5582a75fcad4d80354349dd0df3dbb4845653e53a1b1f425473c: Status 404 returned error can't find the container with id 55dc2cb8c41f5582a75fcad4d80354349dd0df3dbb4845653e53a1b1f425473c
Mar 21 04:51:36 crc kubenswrapper[4580]: W0321 04:51:36.348114 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:36 crc kubenswrapper[4580]: E0321 04:51:36.348540 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.377937 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.381932 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.381971 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.381982 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.382015 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:51:36 crc kubenswrapper[4580]: E0321 04:51:36.382628 4580 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc"
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.527457 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.620998 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"55dc2cb8c41f5582a75fcad4d80354349dd0df3dbb4845653e53a1b1f425473c"}
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.621982 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d0b5628500fd78eae21cfeb055802abf04b6dcbb430a49c6a7967345cdc54436"}
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.623436 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4503f37e56fadac8ecc29ab5878abc949b12443ca4be804e077243a441ee0bad"}
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.624698 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"656da4d1dc7c5b9a0380abc1583a18fd1a504bfa40d67b2e5aef16d593a287e5"}
Mar 21 04:51:36 crc kubenswrapper[4580]: I0321 04:51:36.625454 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5443b0b5be89e879e2bb84b10ff97cf6d0fd58d4b0e55763e5623eccee8acd95"}
Mar 21 04:51:36 crc kubenswrapper[4580]: W0321 04:51:36.718832 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:36 crc kubenswrapper[4580]: E0321 04:51:36.718944 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:51:36 crc kubenswrapper[4580]: W0321 04:51:36.858593 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:36 crc kubenswrapper[4580]: E0321 04:51:36.858688 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:51:36 crc kubenswrapper[4580]: E0321 04:51:36.950567 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="1.6s"
Mar 21 04:51:37 crc kubenswrapper[4580]: W0321 04:51:37.018706 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:37 crc kubenswrapper[4580]: E0321 04:51:37.018825 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.183616 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.185279 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.185331 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.185344 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.185382 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:51:37 crc kubenswrapper[4580]: E0321 04:51:37.185989 4580 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.453637 4580 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 21 04:51:37 crc kubenswrapper[4580]: E0321 04:51:37.455062 4580 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.527513 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.631459 4580 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41" exitCode=0
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.631626 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41"}
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.631652 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.633142 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.633211 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.633231 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.633341 4580 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828" exitCode=0
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.633466 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828"}
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.633497 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.634461 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.634490 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.634502 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.637430 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68"}
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.637517 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e"}
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.638975 4580 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9" exitCode=0
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.639089 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9"}
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.639204 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.640830 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.640855 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.640865 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.642180 4580 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="66879439fc53813893d44ee5736e7248e2cf88eed3dfa7198829dd536598f8af" exitCode=0
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.642252 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"66879439fc53813893d44ee5736e7248e2cf88eed3dfa7198829dd536598f8af"}
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.642350 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.644000 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.644088 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.644105 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.652824 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.655187 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.655228 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:37 crc kubenswrapper[4580]: I0321 04:51:37.655241 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.527316 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:38 crc kubenswrapper[4580]: E0321 04:51:38.551227 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="3.2s"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.646496 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"40eb04b778cb587d9b7938ac13c10070df9e92b5370ca3003492e52c5716022e"}
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.648388 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3"}
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.648417 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.649856 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.649896 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.649909 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.650693 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882"}
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.653064 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32"}
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.655568 4580 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46" exitCode=0
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.655625 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46"}
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.655708 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.656687 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.656724 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.656738 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:51:38 crc kubenswrapper[4580]: W0321 04:51:38.677309 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:38 crc kubenswrapper[4580]: E0321 04:51:38.677409 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.786284 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.787284 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.787317 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.787326 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:51:38 crc kubenswrapper[4580]: I0321 04:51:38.787347 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:51:38 crc kubenswrapper[4580]: E0321 04:51:38.787759 4580 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc"
Mar 21 04:51:38 crc kubenswrapper[4580]: W0321 04:51:38.799337 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:38 crc kubenswrapper[4580]: E0321 04:51:38.799407 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.527403 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:39 crc kubenswrapper[4580]: W0321 04:51:39.637721 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:39 crc kubenswrapper[4580]: E0321 04:51:39.637801 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.660755 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429"}
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.660992 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.662481 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.662705 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.662721 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.664451 4580 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f" exitCode=0
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.664508 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f"}
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.664604 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.666309 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.666332 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.666341 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:51:39 crc kubenswrapper[4580]: W0321 04:51:39.675304 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused
Mar 21 04:51:39 crc kubenswrapper[4580]: E0321 04:51:39.675383 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.695305 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.696562 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0bbfc04456bbc9d1cfbebfdcc34be31c680afe3a7b210a663cbb214c4f929624"}
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.696599 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3d453904d9212dd1ff8de7a05c8a5922e5cf807595bbcb3742a2488190c557d8"}
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.698919 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.698976 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.698993
4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.713519 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.713972 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50"} Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.714005 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930"} Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.714017 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b"} Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.714477 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.714503 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.714512 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:39 crc kubenswrapper[4580]: I0321 04:51:39.763231 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.340206 
4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.718952 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6ffa543a63312c4096ef554f96f08a530551a8d74e998101824831879bd78d75"} Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.719151 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.720041 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.720075 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.720087 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.723838 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.724257 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7d40733302e6914f45c270751c177d3b83a20ff24a950b73a289fcf7abadd3da"} Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.724299 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c903f61cf0e6b078fa36787397aa50d7244f2f58db68e6eaab165f0b44dacd7c"} Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.724316 4580 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"25e55cf6b1c2feb181c0c5139f51f295f900960f5d429a616ebe01bed366c67c"} Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.724327 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a719c23dcc7675e21e14902c6fe2ef4c18bc715a610599abbe9f2b775da9c25e"} Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.724394 4580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.724420 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.725083 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.725112 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.725125 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.725177 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.725200 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:40 crc kubenswrapper[4580]: I0321 04:51:40.725212 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.738983 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"77c8717bf5a92472b2134747e609ccf93c68e68a2021d51be80e4ca5aae8042e"} Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.739102 4580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.739136 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.739162 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.739216 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.740698 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.740758 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.740822 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.742039 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.742069 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.742081 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.742093 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 
04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.742136 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.742159 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.742313 4580 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.988139 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.990443 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.990513 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.990535 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:41 crc kubenswrapper[4580]: I0321 04:51:41.990576 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:51:42 crc kubenswrapper[4580]: I0321 04:51:42.742187 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:42 crc kubenswrapper[4580]: I0321 04:51:42.743741 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:42 crc kubenswrapper[4580]: I0321 04:51:42.743842 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:42 crc kubenswrapper[4580]: I0321 04:51:42.743862 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Mar 21 04:51:43 crc kubenswrapper[4580]: I0321 04:51:43.048932 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:51:43 crc kubenswrapper[4580]: I0321 04:51:43.049195 4580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 04:51:43 crc kubenswrapper[4580]: I0321 04:51:43.049275 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:43 crc kubenswrapper[4580]: I0321 04:51:43.051242 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:43 crc kubenswrapper[4580]: I0321 04:51:43.051301 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:43 crc kubenswrapper[4580]: I0321 04:51:43.051314 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:43 crc kubenswrapper[4580]: I0321 04:51:43.149406 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 21 04:51:43 crc kubenswrapper[4580]: I0321 04:51:43.340463 4580 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:51:43 crc kubenswrapper[4580]: I0321 04:51:43.340615 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" Mar 21 04:51:43 crc kubenswrapper[4580]: I0321 04:51:43.745348 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:43 crc kubenswrapper[4580]: I0321 04:51:43.746508 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:43 crc kubenswrapper[4580]: I0321 04:51:43.746583 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:43 crc kubenswrapper[4580]: I0321 04:51:43.746608 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:44 crc kubenswrapper[4580]: I0321 04:51:44.265165 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:51:44 crc kubenswrapper[4580]: I0321 04:51:44.265460 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:44 crc kubenswrapper[4580]: I0321 04:51:44.267500 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:44 crc kubenswrapper[4580]: I0321 04:51:44.267574 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:44 crc kubenswrapper[4580]: I0321 04:51:44.267600 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:44 crc kubenswrapper[4580]: I0321 04:51:44.809496 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:51:44 crc kubenswrapper[4580]: I0321 04:51:44.809773 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:44 crc kubenswrapper[4580]: I0321 
04:51:44.811483 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:44 crc kubenswrapper[4580]: I0321 04:51:44.811546 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:44 crc kubenswrapper[4580]: I0321 04:51:44.811572 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:45 crc kubenswrapper[4580]: I0321 04:51:45.139314 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:51:45 crc kubenswrapper[4580]: E0321 04:51:45.678656 4580 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:51:45 crc kubenswrapper[4580]: I0321 04:51:45.750565 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:45 crc kubenswrapper[4580]: I0321 04:51:45.756297 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:45 crc kubenswrapper[4580]: I0321 04:51:45.756357 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:45 crc kubenswrapper[4580]: I0321 04:51:45.756373 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.244099 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.244258 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.245250 4580 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.245295 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.245306 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.823752 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.824022 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.825733 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.825846 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.825867 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.971840 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.972020 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.973260 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.973290 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:46 crc 
kubenswrapper[4580]: I0321 04:51:46.973298 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:46 crc kubenswrapper[4580]: I0321 04:51:46.980319 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:51:47 crc kubenswrapper[4580]: I0321 04:51:47.755540 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:47 crc kubenswrapper[4580]: I0321 04:51:47.756474 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:47 crc kubenswrapper[4580]: I0321 04:51:47.756506 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:47 crc kubenswrapper[4580]: I0321 04:51:47.756517 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:47 crc kubenswrapper[4580]: I0321 04:51:47.763423 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:51:48 crc kubenswrapper[4580]: I0321 04:51:48.758229 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:48 crc kubenswrapper[4580]: I0321 04:51:48.759661 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:48 crc kubenswrapper[4580]: I0321 04:51:48.759744 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:48 crc kubenswrapper[4580]: I0321 04:51:48.759764 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:50 crc kubenswrapper[4580]: I0321 04:51:50.528064 4580 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 21 04:51:50 crc kubenswrapper[4580]: E0321 04:51:50.715643 4580 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:51:50 crc kubenswrapper[4580]: E0321 04:51:50.718153 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:50Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 21 04:51:50 crc kubenswrapper[4580]: E0321 04:51:50.720533 4580 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:50Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:51:50 crc kubenswrapper[4580]: W0321 04:51:50.723238 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:50Z is after 2026-02-23T05:33:13Z Mar 21 04:51:50 crc 
kubenswrapper[4580]: E0321 04:51:50.723330 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:51:50 crc kubenswrapper[4580]: W0321 04:51:50.728414 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:50Z is after 2026-02-23T05:33:13Z Mar 21 04:51:50 crc kubenswrapper[4580]: E0321 04:51:50.728491 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:51:50 crc kubenswrapper[4580]: I0321 04:51:50.731821 4580 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not 
found]","reason":"Forbidden","details":{},"code":403} Mar 21 04:51:50 crc kubenswrapper[4580]: I0321 04:51:50.731896 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 21 04:51:50 crc kubenswrapper[4580]: W0321 04:51:50.734280 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:50Z is after 2026-02-23T05:33:13Z Mar 21 04:51:50 crc kubenswrapper[4580]: E0321 04:51:50.734367 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:51:50 crc kubenswrapper[4580]: W0321 04:51:50.735477 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:50Z is after 2026-02-23T05:33:13Z Mar 21 04:51:50 crc kubenswrapper[4580]: E0321 04:51:50.735526 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:51:50 crc kubenswrapper[4580]: E0321 04:51:50.736853 4580 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:50Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ec21443e6ebf8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.524895736 +0000 UTC m=+0.607479374,LastTimestamp:2026-03-21 04:51:35.524895736 +0000 UTC m=+0.607479374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:51:50 crc kubenswrapper[4580]: I0321 04:51:50.743052 4580 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]log ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]etcd ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 21 04:51:50 crc kubenswrapper[4580]: 
[+]poststarthook/openshift.io-api-request-count-filter ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/generic-apiserver-start-informers ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/priority-and-fairness-filter ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/start-apiextensions-informers ok Mar 21 04:51:50 crc kubenswrapper[4580]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 21 04:51:50 crc kubenswrapper[4580]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/start-system-namespaces-controller ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 21 04:51:50 crc kubenswrapper[4580]: [-]poststarthook/start-service-ip-repair-controllers failed: reason withheld Mar 21 04:51:50 crc kubenswrapper[4580]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 21 04:51:50 crc kubenswrapper[4580]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 21 04:51:50 crc kubenswrapper[4580]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Mar 21 04:51:50 crc kubenswrapper[4580]: 
[-]poststarthook/bootstrap-controller failed: reason withheld Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/start-kube-aggregator-informers ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 21 04:51:50 crc kubenswrapper[4580]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 21 04:51:50 crc kubenswrapper[4580]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 21 04:51:50 crc kubenswrapper[4580]: [-]autoregister-completion failed: reason withheld Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/apiservice-openapi-controller ok Mar 21 04:51:50 crc kubenswrapper[4580]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 21 04:51:50 crc kubenswrapper[4580]: livez check failed Mar 21 04:51:50 crc kubenswrapper[4580]: I0321 04:51:50.743124 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:51 crc kubenswrapper[4580]: I0321 04:51:51.529861 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:51Z is after 2026-02-23T05:33:13Z Mar 21 04:51:51 crc kubenswrapper[4580]: I0321 04:51:51.767049 4580 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 21 04:51:51 crc kubenswrapper[4580]: I0321 04:51:51.768763 4580 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6ffa543a63312c4096ef554f96f08a530551a8d74e998101824831879bd78d75" exitCode=255 Mar 21 04:51:51 crc kubenswrapper[4580]: I0321 04:51:51.768825 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6ffa543a63312c4096ef554f96f08a530551a8d74e998101824831879bd78d75"} Mar 21 04:51:51 crc kubenswrapper[4580]: I0321 04:51:51.769001 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:51 crc kubenswrapper[4580]: I0321 04:51:51.769832 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:51 crc kubenswrapper[4580]: I0321 04:51:51.769856 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:51 crc kubenswrapper[4580]: I0321 04:51:51.769866 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:51 crc kubenswrapper[4580]: I0321 04:51:51.770362 4580 scope.go:117] "RemoveContainer" containerID="6ffa543a63312c4096ef554f96f08a530551a8d74e998101824831879bd78d75" Mar 21 04:51:52 crc kubenswrapper[4580]: I0321 04:51:52.529891 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:52Z is after 2026-02-23T05:33:13Z 
Mar 21 04:51:52 crc kubenswrapper[4580]: I0321 04:51:52.774063 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 04:51:52 crc kubenswrapper[4580]: I0321 04:51:52.775206 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 21 04:51:52 crc kubenswrapper[4580]: I0321 04:51:52.778202 4580 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="213c6932301a1f4cee16e4949d1dd345e16bfcb9c6c32ffaa0c619c1fa0f1406" exitCode=255 Mar 21 04:51:52 crc kubenswrapper[4580]: I0321 04:51:52.778294 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"213c6932301a1f4cee16e4949d1dd345e16bfcb9c6c32ffaa0c619c1fa0f1406"} Mar 21 04:51:52 crc kubenswrapper[4580]: I0321 04:51:52.778413 4580 scope.go:117] "RemoveContainer" containerID="6ffa543a63312c4096ef554f96f08a530551a8d74e998101824831879bd78d75" Mar 21 04:51:52 crc kubenswrapper[4580]: I0321 04:51:52.778638 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:52 crc kubenswrapper[4580]: I0321 04:51:52.780509 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:52 crc kubenswrapper[4580]: I0321 04:51:52.780573 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:52 crc kubenswrapper[4580]: I0321 04:51:52.780596 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:52 crc kubenswrapper[4580]: I0321 04:51:52.782099 4580 scope.go:117] 
"RemoveContainer" containerID="213c6932301a1f4cee16e4949d1dd345e16bfcb9c6c32ffaa0c619c1fa0f1406" Mar 21 04:51:52 crc kubenswrapper[4580]: E0321 04:51:52.782518 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:51:53 crc kubenswrapper[4580]: I0321 04:51:53.055482 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:51:53 crc kubenswrapper[4580]: I0321 04:51:53.341294 4580 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:51:53 crc kubenswrapper[4580]: I0321 04:51:53.341492 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:51:53 crc kubenswrapper[4580]: I0321 04:51:53.530248 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:53Z is after 
2026-02-23T05:33:13Z Mar 21 04:51:53 crc kubenswrapper[4580]: I0321 04:51:53.569072 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:51:53 crc kubenswrapper[4580]: I0321 04:51:53.785016 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 04:51:53 crc kubenswrapper[4580]: I0321 04:51:53.787558 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:53 crc kubenswrapper[4580]: I0321 04:51:53.789192 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:53 crc kubenswrapper[4580]: I0321 04:51:53.789232 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:53 crc kubenswrapper[4580]: I0321 04:51:53.789244 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:53 crc kubenswrapper[4580]: I0321 04:51:53.789808 4580 scope.go:117] "RemoveContainer" containerID="213c6932301a1f4cee16e4949d1dd345e16bfcb9c6c32ffaa0c619c1fa0f1406" Mar 21 04:51:53 crc kubenswrapper[4580]: E0321 04:51:53.789979 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:51:53 crc kubenswrapper[4580]: I0321 04:51:53.795495 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:51:54 crc 
kubenswrapper[4580]: I0321 04:51:54.531451 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:54Z is after 2026-02-23T05:33:13Z Mar 21 04:51:54 crc kubenswrapper[4580]: I0321 04:51:54.790353 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:54 crc kubenswrapper[4580]: I0321 04:51:54.791887 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:54 crc kubenswrapper[4580]: I0321 04:51:54.791954 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:54 crc kubenswrapper[4580]: I0321 04:51:54.791981 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:54 crc kubenswrapper[4580]: I0321 04:51:54.793504 4580 scope.go:117] "RemoveContainer" containerID="213c6932301a1f4cee16e4949d1dd345e16bfcb9c6c32ffaa0c619c1fa0f1406" Mar 21 04:51:54 crc kubenswrapper[4580]: E0321 04:51:54.793834 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:51:54 crc kubenswrapper[4580]: I0321 04:51:54.809736 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:51:55 crc kubenswrapper[4580]: I0321 04:51:55.529816 4580 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:55Z is after 2026-02-23T05:33:13Z Mar 21 04:51:55 crc kubenswrapper[4580]: E0321 04:51:55.678757 4580 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:51:55 crc kubenswrapper[4580]: I0321 04:51:55.792622 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:55 crc kubenswrapper[4580]: I0321 04:51:55.793575 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:55 crc kubenswrapper[4580]: I0321 04:51:55.793641 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:55 crc kubenswrapper[4580]: I0321 04:51:55.793652 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:55 crc kubenswrapper[4580]: I0321 04:51:55.794265 4580 scope.go:117] "RemoveContainer" containerID="213c6932301a1f4cee16e4949d1dd345e16bfcb9c6c32ffaa0c619c1fa0f1406" Mar 21 04:51:55 crc kubenswrapper[4580]: E0321 04:51:55.794408 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:51:56 crc kubenswrapper[4580]: I0321 04:51:56.530195 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:56Z is after 2026-02-23T05:33:13Z Mar 21 04:51:56 crc kubenswrapper[4580]: I0321 04:51:56.795273 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:56 crc kubenswrapper[4580]: I0321 04:51:56.796383 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:56 crc kubenswrapper[4580]: I0321 04:51:56.796419 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:56 crc kubenswrapper[4580]: I0321 04:51:56.796435 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:56 crc kubenswrapper[4580]: I0321 04:51:56.797251 4580 scope.go:117] "RemoveContainer" containerID="213c6932301a1f4cee16e4949d1dd345e16bfcb9c6c32ffaa0c619c1fa0f1406" Mar 21 04:51:56 crc kubenswrapper[4580]: E0321 04:51:56.797511 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:51:56 crc kubenswrapper[4580]: I0321 04:51:56.857122 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 21 04:51:56 crc kubenswrapper[4580]: I0321 04:51:56.857472 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:56 crc kubenswrapper[4580]: I0321 04:51:56.859347 4580 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:56 crc kubenswrapper[4580]: I0321 04:51:56.859406 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:56 crc kubenswrapper[4580]: I0321 04:51:56.859425 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:56 crc kubenswrapper[4580]: I0321 04:51:56.881824 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 21 04:51:57 crc kubenswrapper[4580]: I0321 04:51:57.121225 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:57 crc kubenswrapper[4580]: E0321 04:51:57.121262 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:57Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 04:51:57 crc kubenswrapper[4580]: I0321 04:51:57.125808 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:57 crc kubenswrapper[4580]: I0321 04:51:57.125842 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:57 crc kubenswrapper[4580]: I0321 04:51:57.125853 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:57 crc kubenswrapper[4580]: I0321 04:51:57.125879 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:51:57 crc kubenswrapper[4580]: E0321 04:51:57.128569 4580 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:57Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:51:57 crc kubenswrapper[4580]: I0321 04:51:57.530776 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:57Z is after 2026-02-23T05:33:13Z Mar 21 04:51:57 crc kubenswrapper[4580]: W0321 04:51:57.541251 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:57Z is after 2026-02-23T05:33:13Z Mar 21 04:51:57 crc kubenswrapper[4580]: E0321 04:51:57.541366 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:51:57 crc kubenswrapper[4580]: I0321 04:51:57.798035 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:51:57 crc kubenswrapper[4580]: I0321 04:51:57.799609 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:51:57 crc kubenswrapper[4580]: I0321 04:51:57.799646 4580 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 21 04:51:57 crc kubenswrapper[4580]: I0321 04:51:57.799657 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:51:58 crc kubenswrapper[4580]: I0321 04:51:58.531223 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:58Z is after 2026-02-23T05:33:13Z Mar 21 04:51:58 crc kubenswrapper[4580]: W0321 04:51:58.789067 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:58Z is after 2026-02-23T05:33:13Z Mar 21 04:51:58 crc kubenswrapper[4580]: E0321 04:51:58.789171 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:51:59 crc kubenswrapper[4580]: I0321 04:51:59.048234 4580 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 04:51:59 crc kubenswrapper[4580]: E0321 04:51:59.053360 4580 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:51:59 crc kubenswrapper[4580]: W0321 04:51:59.128830 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:59Z is after 2026-02-23T05:33:13Z Mar 21 04:51:59 crc kubenswrapper[4580]: E0321 04:51:59.129099 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:51:59 crc kubenswrapper[4580]: I0321 04:51:59.529688 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:51:59Z is after 2026-02-23T05:33:13Z Mar 21 04:52:00 crc kubenswrapper[4580]: I0321 04:52:00.529412 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:00Z is after 
2026-02-23T05:33:13Z Mar 21 04:52:00 crc kubenswrapper[4580]: E0321 04:52:00.741418 4580 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:00Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ec21443e6ebf8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.524895736 +0000 UTC m=+0.607479374,LastTimestamp:2026-03-21 04:51:35.524895736 +0000 UTC m=+0.607479374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:01 crc kubenswrapper[4580]: I0321 04:52:01.529082 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:01Z is after 2026-02-23T05:33:13Z Mar 21 04:52:01 crc kubenswrapper[4580]: W0321 04:52:01.630239 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:01Z is after 2026-02-23T05:33:13Z Mar 21 04:52:01 crc kubenswrapper[4580]: E0321 04:52:01.630316 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: 
Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:52:02 crc kubenswrapper[4580]: I0321 04:52:02.534309 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 2026-02-23T05:33:13Z Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.340552 4580 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.340634 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.340695 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.340874 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.344733 4580 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.345111 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.345236 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.345735 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.346080 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68" gracePeriod=30 Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.531798 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:03Z is after 2026-02-23T05:33:13Z Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.816883 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.817738 
4580 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68" exitCode=255 Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.817805 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68"} Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.817838 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14"} Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.817938 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.818683 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.818717 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:03 crc kubenswrapper[4580]: I0321 04:52:03.818726 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:04 crc kubenswrapper[4580]: E0321 04:52:04.128616 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:04Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 04:52:04 crc kubenswrapper[4580]: 
I0321 04:52:04.128669 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:04 crc kubenswrapper[4580]: I0321 04:52:04.130879 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:04 crc kubenswrapper[4580]: I0321 04:52:04.130990 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:04 crc kubenswrapper[4580]: I0321 04:52:04.131016 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:04 crc kubenswrapper[4580]: I0321 04:52:04.131066 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:52:04 crc kubenswrapper[4580]: E0321 04:52:04.134346 4580 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:04Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:52:04 crc kubenswrapper[4580]: I0321 04:52:04.532438 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:04Z is after 2026-02-23T05:33:13Z Mar 21 04:52:05 crc kubenswrapper[4580]: I0321 04:52:05.530066 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:05Z is after 2026-02-23T05:33:13Z Mar 21 04:52:05 crc kubenswrapper[4580]: E0321 04:52:05.678908 4580 
eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:52:06 crc kubenswrapper[4580]: I0321 04:52:06.243456 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:52:06 crc kubenswrapper[4580]: I0321 04:52:06.243714 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:06 crc kubenswrapper[4580]: I0321 04:52:06.245259 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:06 crc kubenswrapper[4580]: I0321 04:52:06.245334 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:06 crc kubenswrapper[4580]: I0321 04:52:06.245362 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:06 crc kubenswrapper[4580]: I0321 04:52:06.529458 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:06Z is after 2026-02-23T05:33:13Z Mar 21 04:52:07 crc kubenswrapper[4580]: I0321 04:52:07.531280 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:07Z is after 2026-02-23T05:33:13Z Mar 21 04:52:08 crc kubenswrapper[4580]: I0321 04:52:08.529370 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:08Z is after 2026-02-23T05:33:13Z Mar 21 04:52:09 crc kubenswrapper[4580]: I0321 04:52:09.532040 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:09Z is after 2026-02-23T05:33:13Z Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.341688 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.342262 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.343238 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.343369 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.343468 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.531683 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:10Z is after 2026-02-23T05:33:13Z Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.617452 4580 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.618900 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.618951 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.618964 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.619661 4580 scope.go:117] "RemoveContainer" containerID="213c6932301a1f4cee16e4949d1dd345e16bfcb9c6c32ffaa0c619c1fa0f1406" Mar 21 04:52:10 crc kubenswrapper[4580]: E0321 04:52:10.745625 4580 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:10Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ec21443e6ebf8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.524895736 +0000 UTC m=+0.607479374,LastTimestamp:2026-03-21 04:51:35.524895736 +0000 UTC m=+0.607479374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.842990 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 
04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.845371 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"65085743d7ee152a030b3a8764759c08e2b7178f0f1e3c65a908a8c84227b881"} Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.845511 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.846372 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.846402 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:10 crc kubenswrapper[4580]: I0321 04:52:10.846414 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:11 crc kubenswrapper[4580]: E0321 04:52:11.134183 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:11Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 04:52:11 crc kubenswrapper[4580]: I0321 04:52:11.134439 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:11 crc kubenswrapper[4580]: I0321 04:52:11.135752 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:11 crc kubenswrapper[4580]: I0321 04:52:11.135817 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:11 crc kubenswrapper[4580]: I0321 04:52:11.135829 4580 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:11 crc kubenswrapper[4580]: I0321 04:52:11.135852 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:52:11 crc kubenswrapper[4580]: E0321 04:52:11.139686 4580 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:11Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:52:11 crc kubenswrapper[4580]: I0321 04:52:11.532021 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:11Z is after 2026-02-23T05:33:13Z Mar 21 04:52:11 crc kubenswrapper[4580]: W0321 04:52:11.941832 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:11Z is after 2026-02-23T05:33:13Z Mar 21 04:52:11 crc kubenswrapper[4580]: E0321 04:52:11.941959 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:52:12 crc kubenswrapper[4580]: I0321 04:52:12.529871 4580 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:12Z is after 2026-02-23T05:33:13Z Mar 21 04:52:12 crc kubenswrapper[4580]: I0321 04:52:12.874651 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 04:52:12 crc kubenswrapper[4580]: I0321 04:52:12.875196 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 04:52:12 crc kubenswrapper[4580]: I0321 04:52:12.876797 4580 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="65085743d7ee152a030b3a8764759c08e2b7178f0f1e3c65a908a8c84227b881" exitCode=255 Mar 21 04:52:12 crc kubenswrapper[4580]: I0321 04:52:12.876806 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"65085743d7ee152a030b3a8764759c08e2b7178f0f1e3c65a908a8c84227b881"} Mar 21 04:52:12 crc kubenswrapper[4580]: I0321 04:52:12.876862 4580 scope.go:117] "RemoveContainer" containerID="213c6932301a1f4cee16e4949d1dd345e16bfcb9c6c32ffaa0c619c1fa0f1406" Mar 21 04:52:12 crc kubenswrapper[4580]: I0321 04:52:12.877033 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:12 crc kubenswrapper[4580]: I0321 04:52:12.877719 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:12 crc kubenswrapper[4580]: I0321 04:52:12.877839 4580 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:12 crc kubenswrapper[4580]: I0321 04:52:12.877944 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:12 crc kubenswrapper[4580]: I0321 04:52:12.878524 4580 scope.go:117] "RemoveContainer" containerID="65085743d7ee152a030b3a8764759c08e2b7178f0f1e3c65a908a8c84227b881" Mar 21 04:52:12 crc kubenswrapper[4580]: E0321 04:52:12.878913 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:52:13 crc kubenswrapper[4580]: W0321 04:52:13.233244 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:13Z is after 2026-02-23T05:33:13Z Mar 21 04:52:13 crc kubenswrapper[4580]: E0321 04:52:13.233330 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:52:13 crc kubenswrapper[4580]: I0321 04:52:13.341505 4580 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: 
Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:52:13 crc kubenswrapper[4580]: I0321 04:52:13.341580 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:52:13 crc kubenswrapper[4580]: I0321 04:52:13.531148 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:13Z is after 2026-02-23T05:33:13Z Mar 21 04:52:13 crc kubenswrapper[4580]: I0321 04:52:13.569191 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:13 crc kubenswrapper[4580]: I0321 04:52:13.880842 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 04:52:13 crc kubenswrapper[4580]: I0321 04:52:13.883008 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:13 crc kubenswrapper[4580]: I0321 04:52:13.883867 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:13 crc kubenswrapper[4580]: I0321 04:52:13.883933 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 
21 04:52:13 crc kubenswrapper[4580]: I0321 04:52:13.883959 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:13 crc kubenswrapper[4580]: I0321 04:52:13.884814 4580 scope.go:117] "RemoveContainer" containerID="65085743d7ee152a030b3a8764759c08e2b7178f0f1e3c65a908a8c84227b881" Mar 21 04:52:13 crc kubenswrapper[4580]: E0321 04:52:13.885115 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:52:14 crc kubenswrapper[4580]: I0321 04:52:14.530699 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:14Z is after 2026-02-23T05:33:13Z Mar 21 04:52:14 crc kubenswrapper[4580]: I0321 04:52:14.809843 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:14 crc kubenswrapper[4580]: I0321 04:52:14.886591 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:14 crc kubenswrapper[4580]: I0321 04:52:14.888221 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:14 crc kubenswrapper[4580]: I0321 04:52:14.888286 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:14 crc kubenswrapper[4580]: I0321 04:52:14.888298 4580 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:14 crc kubenswrapper[4580]: I0321 04:52:14.889288 4580 scope.go:117] "RemoveContainer" containerID="65085743d7ee152a030b3a8764759c08e2b7178f0f1e3c65a908a8c84227b881" Mar 21 04:52:14 crc kubenswrapper[4580]: E0321 04:52:14.889590 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:52:15 crc kubenswrapper[4580]: I0321 04:52:15.531402 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:15Z is after 2026-02-23T05:33:13Z Mar 21 04:52:15 crc kubenswrapper[4580]: E0321 04:52:15.679146 4580 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:52:15 crc kubenswrapper[4580]: I0321 04:52:15.956999 4580 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 04:52:15 crc kubenswrapper[4580]: E0321 04:52:15.960217 4580 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:15Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 21 04:52:15 crc kubenswrapper[4580]: E0321 04:52:15.961382 4580 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 21 04:52:16 crc kubenswrapper[4580]: W0321 04:52:16.138700 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:16Z is after 2026-02-23T05:33:13Z Mar 21 04:52:16 crc kubenswrapper[4580]: E0321 04:52:16.138844 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:52:16 crc kubenswrapper[4580]: I0321 04:52:16.529707 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:16Z is after 2026-02-23T05:33:13Z Mar 21 04:52:17 crc kubenswrapper[4580]: I0321 04:52:17.532609 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:17Z is after 2026-02-23T05:33:13Z 
Mar 21 04:52:18 crc kubenswrapper[4580]: E0321 04:52:18.139924 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:18Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 04:52:18 crc kubenswrapper[4580]: I0321 04:52:18.139997 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:18 crc kubenswrapper[4580]: I0321 04:52:18.142206 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:18 crc kubenswrapper[4580]: I0321 04:52:18.142287 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:18 crc kubenswrapper[4580]: I0321 04:52:18.142309 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:18 crc kubenswrapper[4580]: I0321 04:52:18.142355 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:52:18 crc kubenswrapper[4580]: E0321 04:52:18.147744 4580 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:18Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:52:18 crc kubenswrapper[4580]: I0321 04:52:18.530183 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:18Z is after 2026-02-23T05:33:13Z Mar 
21 04:52:19 crc kubenswrapper[4580]: I0321 04:52:19.530039 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:19Z is after 2026-02-23T05:33:13Z Mar 21 04:52:20 crc kubenswrapper[4580]: W0321 04:52:20.442172 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:20Z is after 2026-02-23T05:33:13Z Mar 21 04:52:20 crc kubenswrapper[4580]: E0321 04:52:20.442281 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:52:20 crc kubenswrapper[4580]: I0321 04:52:20.530871 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:20Z is after 2026-02-23T05:33:13Z Mar 21 04:52:20 crc kubenswrapper[4580]: E0321 04:52:20.752283 4580 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:52:20Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ec21443e6ebf8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.524895736 +0000 UTC m=+0.607479374,LastTimestamp:2026-03-21 04:51:35.524895736 +0000 UTC m=+0.607479374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:21 crc kubenswrapper[4580]: I0321 04:52:21.530587 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:21Z is after 2026-02-23T05:33:13Z Mar 21 04:52:22 crc kubenswrapper[4580]: I0321 04:52:22.535101 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:23 crc kubenswrapper[4580]: I0321 04:52:23.341039 4580 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:52:23 crc kubenswrapper[4580]: I0321 04:52:23.341151 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:52:23 crc kubenswrapper[4580]: I0321 04:52:23.532180 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:24 crc kubenswrapper[4580]: I0321 04:52:24.286375 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:52:24 crc kubenswrapper[4580]: I0321 04:52:24.286637 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:24 crc kubenswrapper[4580]: I0321 04:52:24.288223 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:24 crc kubenswrapper[4580]: I0321 04:52:24.288305 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:24 crc kubenswrapper[4580]: I0321 04:52:24.288325 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:24 crc kubenswrapper[4580]: I0321 04:52:24.531449 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:25 crc kubenswrapper[4580]: E0321 04:52:25.147903 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API 
group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 04:52:25 crc kubenswrapper[4580]: I0321 04:52:25.148923 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:25 crc kubenswrapper[4580]: I0321 04:52:25.150288 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:25 crc kubenswrapper[4580]: I0321 04:52:25.150358 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:25 crc kubenswrapper[4580]: I0321 04:52:25.150392 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:25 crc kubenswrapper[4580]: I0321 04:52:25.150424 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:52:25 crc kubenswrapper[4580]: E0321 04:52:25.157764 4580 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 04:52:25 crc kubenswrapper[4580]: I0321 04:52:25.536202 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:25 crc kubenswrapper[4580]: E0321 04:52:25.679350 4580 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:52:26 crc kubenswrapper[4580]: I0321 04:52:26.535481 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:26 crc 
kubenswrapper[4580]: I0321 04:52:26.617048 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:52:26 crc kubenswrapper[4580]: I0321 04:52:26.618283 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:52:26 crc kubenswrapper[4580]: I0321 04:52:26.618309 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:52:26 crc kubenswrapper[4580]: I0321 04:52:26.618317 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:52:26 crc kubenswrapper[4580]: I0321 04:52:26.618766 4580 scope.go:117] "RemoveContainer" containerID="65085743d7ee152a030b3a8764759c08e2b7178f0f1e3c65a908a8c84227b881"
Mar 21 04:52:26 crc kubenswrapper[4580]: E0321 04:52:26.618974 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 04:52:27 crc kubenswrapper[4580]: I0321 04:52:27.531752 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:52:28 crc kubenswrapper[4580]: I0321 04:52:28.530999 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:52:29 crc kubenswrapper[4580]: I0321 04:52:29.532462 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:52:30 crc kubenswrapper[4580]: I0321 04:52:30.530916 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.759606 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21443e6ebf8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.524895736 +0000 UTC m=+0.607479374,LastTimestamp:2026-03-21 04:51:35.524895736 +0000 UTC m=+0.607479374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.765233 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e08c8a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591586954 +0000 UTC m=+0.674170602,LastTimestamp:2026-03-21 04:51:35.591586954 +0000 UTC m=+0.674170602,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.769545 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e0d9b0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591606704 +0000 UTC m=+0.674190342,LastTimestamp:2026-03-21 04:51:35.591606704 +0000 UTC m=+0.674190342,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.773921 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e105e8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591618024 +0000 UTC m=+0.674201662,LastTimestamp:2026-03-21 04:51:35.591618024 +0000 UTC m=+0.674201662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.778859 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec2144cabbb7b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.672011643 +0000 UTC m=+0.754595271,LastTimestamp:2026-03-21 04:51:35.672011643 +0000 UTC m=+0.754595271,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.784252 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e08c8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e08c8a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591586954 +0000 UTC m=+0.674170602,LastTimestamp:2026-03-21 04:51:35.719380512 +0000 UTC m=+0.801964150,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.788529 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e0d9b0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e0d9b0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591606704 +0000 UTC m=+0.674190342,LastTimestamp:2026-03-21 04:51:35.719397093 +0000 UTC m=+0.801980731,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.793330 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e105e8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e105e8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591618024 +0000 UTC m=+0.674201662,LastTimestamp:2026-03-21 04:51:35.719413153 +0000 UTC m=+0.801996791,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.798570 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e08c8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e08c8a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591586954 +0000 UTC m=+0.674170602,LastTimestamp:2026-03-21 04:51:35.720798649 +0000 UTC m=+0.803382277,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.805109 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e0d9b0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e0d9b0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591606704 +0000 UTC m=+0.674190342,LastTimestamp:2026-03-21 04:51:35.720810339 +0000 UTC m=+0.803393957,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.809550 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e105e8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e105e8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591618024 +0000 UTC m=+0.674201662,LastTimestamp:2026-03-21 04:51:35.72081826 +0000 UTC m=+0.803401888,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.814205 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e08c8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e08c8a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591586954 +0000 UTC m=+0.674170602,LastTimestamp:2026-03-21 04:51:35.721435277 +0000 UTC m=+0.804018905,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.820899 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e0d9b0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e0d9b0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591606704 +0000 UTC m=+0.674190342,LastTimestamp:2026-03-21 04:51:35.721454647 +0000 UTC m=+0.804038275,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.828366 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e105e8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e105e8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591618024 +0000 UTC m=+0.674201662,LastTimestamp:2026-03-21 04:51:35.721464157 +0000 UTC m=+0.804047785,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.833952 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e08c8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e08c8a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591586954 +0000 UTC m=+0.674170602,LastTimestamp:2026-03-21 04:51:35.723156507 +0000 UTC m=+0.805740135,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.840830 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e0d9b0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e0d9b0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591606704 +0000 UTC m=+0.674190342,LastTimestamp:2026-03-21 04:51:35.723167847 +0000 UTC m=+0.805751475,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.852378 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e105e8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e105e8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591618024 +0000 UTC m=+0.674201662,LastTimestamp:2026-03-21 04:51:35.723196688 +0000 UTC m=+0.805780316,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.857295 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e08c8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e08c8a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591586954 +0000 UTC m=+0.674170602,LastTimestamp:2026-03-21 04:51:35.723212638 +0000 UTC m=+0.805796266,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.863326 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e0d9b0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e0d9b0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591606704 +0000 UTC m=+0.674190342,LastTimestamp:2026-03-21 04:51:35.723230608 +0000 UTC m=+0.805814236,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.868468 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e105e8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e105e8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591618024 +0000 UTC m=+0.674201662,LastTimestamp:2026-03-21 04:51:35.723239648 +0000 UTC m=+0.805823276,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.873895 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e08c8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e08c8a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591586954 +0000 UTC m=+0.674170602,LastTimestamp:2026-03-21 04:51:35.724143059 +0000 UTC m=+0.806726687,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.878415 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e0d9b0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e0d9b0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591606704 +0000 UTC m=+0.674190342,LastTimestamp:2026-03-21 04:51:35.724155359 +0000 UTC m=+0.806738987,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.883212 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e105e8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e105e8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591618024 +0000 UTC m=+0.674201662,LastTimestamp:2026-03-21 04:51:35.724165569 +0000 UTC m=+0.806749197,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.889309 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e08c8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e08c8a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591586954 +0000 UTC m=+0.674170602,LastTimestamp:2026-03-21 04:51:35.724351531 +0000 UTC m=+0.806935159,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.895716 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec21447e0d9b0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec21447e0d9b0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:35.591606704 +0000 UTC m=+0.674190342,LastTimestamp:2026-03-21 04:51:35.724362091 +0000 UTC m=+0.806945719,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.906636 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec2146967109b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:36.154050715 +0000 UTC m=+1.236634343,LastTimestamp:2026-03-21 04:51:36.154050715 +0000 UTC m=+1.236634343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.913264 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec2146974cc96 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:36.154950806 +0000 UTC m=+1.237534434,LastTimestamp:2026-03-21 04:51:36.154950806 +0000 UTC m=+1.237534434,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.921164 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec2146975490a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:36.154982666 +0000 UTC m=+1.237566294,LastTimestamp:2026-03-21 04:51:36.154982666 +0000 UTC m=+1.237566294,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.925920 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec2146a0c6427 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:36.164885543 +0000 UTC m=+1.247469171,LastTimestamp:2026-03-21 04:51:36.164885543 +0000 UTC m=+1.247469171,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.931445 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec2146bc23e4e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:36.193580622 +0000 UTC m=+1.276164290,LastTimestamp:2026-03-21 04:51:36.193580622 +0000 UTC m=+1.276164290,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.936161 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec2149d3552fb openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.023206139 +0000 UTC m=+2.105789767,LastTimestamp:2026-03-21 04:51:37.023206139 +0000 UTC m=+2.105789767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.940577 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec2149d373cc0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.02333152 +0000 UTC m=+2.105915148,LastTimestamp:2026-03-21 04:51:37.02333152 +0000 UTC m=+2.105915148,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.944889 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec2149d37fb1b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.023380251 +0000 UTC m=+2.105963879,LastTimestamp:2026-03-21 04:51:37.023380251 +0000 UTC m=+2.105963879,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.950178 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec2149d392a6e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.023457902 +0000 UTC m=+2.106041550,LastTimestamp:2026-03-21 04:51:37.023457902 +0000 UTC m=+2.106041550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.953879 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec2149d396880 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.023473792 +0000 UTC m=+2.106057420,LastTimestamp:2026-03-21 04:51:37.023473792 +0000 UTC m=+2.106057420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.957681 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec214a24745b8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.108268472 +0000 UTC m=+2.190852100,LastTimestamp:2026-03-21 04:51:37.108268472 +0000 UTC m=+2.190852100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.962211 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec214a2493207 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.108394503 +0000 UTC m=+2.190978141,LastTimestamp:2026-03-21 04:51:37.108394503 +0000 UTC m=+2.190978141,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.966525 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec214a255cd69 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.109220713 +0000 UTC m=+2.191804351,LastTimestamp:2026-03-21 04:51:37.109220713 +0000 UTC m=+2.191804351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.970710 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource
\"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec214a261604c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.109979212 +0000 UTC m=+2.192562840,LastTimestamp:2026-03-21 04:51:37.109979212 +0000 UTC m=+2.192562840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.975218 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec214a2649c33 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.110191155 +0000 UTC m=+2.192774803,LastTimestamp:2026-03-21 04:51:37.110191155 +0000 UTC m=+2.192774803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.978573 4580 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec214a28d7604 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.112868356 +0000 UTC m=+2.195451994,LastTimestamp:2026-03-21 04:51:37.112868356 +0000 UTC m=+2.195451994,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.983258 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec214b9f16be1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.505295329 +0000 UTC m=+2.587878987,LastTimestamp:2026-03-21 
04:51:37.505295329 +0000 UTC m=+2.587878987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.987636 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec214bfae2ea3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.601552035 +0000 UTC m=+2.684135683,LastTimestamp:2026-03-21 04:51:37.601552035 +0000 UTC m=+2.684135683,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.991154 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec214bfce7cf8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.60366924 +0000 UTC m=+2.686252898,LastTimestamp:2026-03-21 04:51:37.60366924 +0000 UTC m=+2.686252898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.995566 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec214c2b4abdb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.652308955 +0000 UTC m=+2.734892593,LastTimestamp:2026-03-21 04:51:37.652308955 +0000 UTC m=+2.734892593,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:30 crc kubenswrapper[4580]: E0321 04:52:30.999721 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec214c2b4b9eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.652312555 +0000 UTC m=+2.734896203,LastTimestamp:2026-03-21 04:51:37.652312555 +0000 UTC m=+2.734896203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.003357 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec214c2bd333d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.652867901 +0000 UTC m=+2.735451529,LastTimestamp:2026-03-21 04:51:37.652867901 +0000 UTC m=+2.735451529,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.006986 4580 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec214c2be2a82 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.652931202 +0000 UTC m=+2.735514830,LastTimestamp:2026-03-21 04:51:37.652931202 +0000 UTC m=+2.735514830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.010529 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec214dcd21632 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.090444338 +0000 UTC 
m=+3.173027996,LastTimestamp:2026-03-21 04:51:38.090444338 +0000 UTC m=+3.173027996,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.014942 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec214dfd832bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.141176507 +0000 UTC m=+3.223760135,LastTimestamp:2026-03-21 04:51:38.141176507 +0000 UTC m=+3.223760135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.018827 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec214e401f89a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.211023002 +0000 UTC 
m=+3.293606650,LastTimestamp:2026-03-21 04:51:38.211023002 +0000 UTC m=+3.293606650,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.022177 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec214e40841cc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.211434956 +0000 UTC m=+3.294018584,LastTimestamp:2026-03-21 04:51:38.211434956 +0000 UTC m=+3.294018584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.026382 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec214e4101b5b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container 
kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.211949403 +0000 UTC m=+3.294533031,LastTimestamp:2026-03-21 04:51:38.211949403 +0000 UTC m=+3.294533031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.029527 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec214f135e215 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.432528917 +0000 UTC m=+3.515112545,LastTimestamp:2026-03-21 04:51:38.432528917 +0000 UTC m=+3.515112545,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.033591 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec214f14b1339 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.433917753 +0000 UTC m=+3.516501381,LastTimestamp:2026-03-21 04:51:38.433917753 +0000 UTC m=+3.516501381,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.037284 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec214f8e5370b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.561459979 +0000 UTC m=+3.644043607,LastTimestamp:2026-03-21 04:51:38.561459979 +0000 UTC m=+3.644043607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.041636 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec214f8e5fffc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.56151142 +0000 UTC m=+3.644095048,LastTimestamp:2026-03-21 04:51:38.56151142 +0000 UTC m=+3.644095048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.045051 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec214f8eeaa4b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.562079307 +0000 UTC m=+3.644662945,LastTimestamp:2026-03-21 04:51:38.562079307 +0000 UTC m=+3.644662945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.048519 4580 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec214f8f51566 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.562499942 +0000 UTC m=+3.645083580,LastTimestamp:2026-03-21 04:51:38.562499942 +0000 UTC m=+3.645083580,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.053852 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec214f8f906b5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.562758325 +0000 UTC m=+3.645341953,LastTimestamp:2026-03-21 04:51:38.562758325 +0000 UTC m=+3.645341953,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.055749 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec214f907da2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.563729966 +0000 UTC m=+3.646313604,LastTimestamp:2026-03-21 04:51:38.563729966 +0000 UTC m=+3.646313604,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.060328 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec214feac5476 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.658395254 +0000 UTC m=+3.740978882,LastTimestamp:2026-03-21 04:51:38.658395254 +0000 UTC m=+3.740978882,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.069654 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec21507953e6d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:38.807877229 +0000 UTC m=+3.890460857,LastTimestamp:2026-03-21 04:51:38.807877229 +0000 UTC m=+3.890460857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.074683 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec21518dc9962 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.097766242 +0000 UTC m=+4.180349870,LastTimestamp:2026-03-21 04:51:39.097766242 +0000 UTC m=+4.180349870,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.081798 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec215193f5bf2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.104238578 +0000 UTC m=+4.186822206,LastTimestamp:2026-03-21 04:51:39.104238578 +0000 UTC m=+4.186822206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.085714 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec2151943cf42 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.104530242 +0000 UTC m=+4.187113870,LastTimestamp:2026-03-21 04:51:39.104530242 +0000 UTC m=+4.187113870,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.091199 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec21519458671 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.104642673 +0000 UTC m=+4.187226301,LastTimestamp:2026-03-21 04:51:39.104642673 +0000 UTC m=+4.187226301,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.095329 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec21519be1b34 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.112545076 +0000 UTC m=+4.195128704,LastTimestamp:2026-03-21 04:51:39.112545076 +0000 UTC m=+4.195128704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.099523 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec21519db7759 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.114469209 +0000 UTC m=+4.197052837,LastTimestamp:2026-03-21 04:51:39.114469209 +0000 UTC m=+4.197052837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.106260 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec2151a709f46 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.124244294 +0000 UTC m=+4.206827942,LastTimestamp:2026-03-21 04:51:39.124244294 +0000 UTC m=+4.206827942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.112509 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec2151a93f46c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.126559852 +0000 UTC m=+4.209143480,LastTimestamp:2026-03-21 04:51:39.126559852 +0000 UTC m=+4.209143480,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.118318 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec2151c7cb8f2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.15859173 +0000 UTC m=+4.241175358,LastTimestamp:2026-03-21 04:51:39.15859173 +0000 UTC m=+4.241175358,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.123034 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec215252a912b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.304202539 +0000 UTC m=+4.386786167,LastTimestamp:2026-03-21 04:51:39.304202539 +0000 UTC m=+4.386786167,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.133195 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec215260fd789 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.319228297 +0000 UTC m=+4.401811925,LastTimestamp:2026-03-21 04:51:39.319228297 +0000 UTC m=+4.401811925,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.138595 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec21526716dbe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.325623742 +0000 UTC m=+4.408207370,LastTimestamp:2026-03-21 04:51:39.325623742 +0000 UTC m=+4.408207370,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.142063 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec215275ee426 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.341186086 +0000 UTC m=+4.423769714,LastTimestamp:2026-03-21 04:51:39.341186086 +0000 UTC m=+4.423769714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.145506 4580 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec21527730b78 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.342506872 +0000 UTC m=+4.425090500,LastTimestamp:2026-03-21 04:51:39.342506872 +0000 UTC m=+4.425090500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.149354 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec21533082b20 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.536829216 +0000 UTC m=+4.619412844,LastTimestamp:2026-03-21 04:51:39.536829216 +0000 UTC 
m=+4.619412844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.153702 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec21533c2d515 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.549062421 +0000 UTC m=+4.631646059,LastTimestamp:2026-03-21 04:51:39.549062421 +0000 UTC m=+4.631646059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.158056 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec21533d8e866 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.550509158 +0000 UTC m=+4.633092786,LastTimestamp:2026-03-21 04:51:39.550509158 +0000 UTC m=+4.633092786,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.160737 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec2153ad4f081 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.667689601 +0000 UTC m=+4.750273229,LastTimestamp:2026-03-21 04:51:39.667689601 +0000 UTC m=+4.750273229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.162258 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec215406fafe5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.761717221 +0000 UTC m=+4.844300839,LastTimestamp:2026-03-21 04:51:39.761717221 +0000 UTC m=+4.844300839,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.166797 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec2154246bfac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.792588716 +0000 UTC m=+4.875172344,LastTimestamp:2026-03-21 04:51:39.792588716 +0000 UTC m=+4.875172344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.172467 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec21549670604 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.912144388 +0000 UTC m=+4.994728016,LastTimestamp:2026-03-21 04:51:39.912144388 +0000 UTC m=+4.994728016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.176997 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec2154a26da2c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.924716076 +0000 UTC m=+5.007299704,LastTimestamp:2026-03-21 04:51:39.924716076 +0000 UTC m=+5.007299704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.181375 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec2154a45e614 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.92675074 +0000 UTC m=+5.009334358,LastTimestamp:2026-03-21 04:51:39.92675074 +0000 UTC m=+5.009334358,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.184801 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec2155520e7d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:40.108875731 +0000 UTC m=+5.191459359,LastTimestamp:2026-03-21 04:51:40.108875731 +0000 UTC m=+5.191459359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.188652 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec215565c5cb7 openshift-etcd 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:40.129549495 +0000 UTC m=+5.212133123,LastTimestamp:2026-03-21 04:51:40.129549495 +0000 UTC m=+5.212133123,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.191967 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec215567105af openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:40.130903471 +0000 UTC m=+5.213487099,LastTimestamp:2026-03-21 04:51:40.130903471 +0000 UTC m=+5.213487099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.197307 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ec21563d50743 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:40.355561283 +0000 UTC m=+5.438144911,LastTimestamp:2026-03-21 04:51:40.355561283 +0000 UTC m=+5.438144911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.200895 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec21564669562 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:40.365100386 +0000 UTC m=+5.447684014,LastTimestamp:2026-03-21 04:51:40.365100386 +0000 UTC m=+5.447684014,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.204027 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec2156477ea4f openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:40.366236239 +0000 UTC m=+5.448819867,LastTimestamp:2026-03-21 04:51:40.366236239 +0000 UTC m=+5.448819867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.207753 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec2156e5fdffc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:40.532432892 +0000 UTC m=+5.615016520,LastTimestamp:2026-03-21 04:51:40.532432892 +0000 UTC m=+5.615016520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.211269 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ec2156f4d3038 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:40.547985464 +0000 UTC m=+5.630569092,LastTimestamp:2026-03-21 04:51:40.547985464 +0000 UTC m=+5.630569092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.215360 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec2156f6312b7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:40.549419703 +0000 UTC m=+5.632003331,LastTimestamp:2026-03-21 04:51:40.549419703 +0000 UTC m=+5.632003331,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.219452 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec21579d78cbd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:40.724825277 +0000 UTC m=+5.807408905,LastTimestamp:2026-03-21 04:51:40.724825277 +0000 UTC m=+5.807408905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.223749 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec2157a7be971 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:40.735596913 +0000 UTC m=+5.818180541,LastTimestamp:2026-03-21 04:51:40.735596913 +0000 UTC m=+5.818180541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.229155 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:52:31 crc 
kubenswrapper[4580]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec21615c09502 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 04:52:31 crc kubenswrapper[4580]: body: Mar 21 04:52:31 crc kubenswrapper[4580]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:43.340565762 +0000 UTC m=+8.423149420,LastTimestamp:2026-03-21 04:51:43.340565762 +0000 UTC m=+8.423149420,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:52:31 crc kubenswrapper[4580]: > Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.233171 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec21615c2669b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:43.340684955 +0000 UTC m=+8.423268643,LastTimestamp:2026-03-21 04:51:43.340684955 +0000 UTC m=+8.423268643,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.239113 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 21 04:52:31 crc kubenswrapper[4580]: &Event{ObjectMeta:{kube-apiserver-crc.189ec217ce4f1147 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 21 04:52:31 crc kubenswrapper[4580]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 21 04:52:31 crc kubenswrapper[4580]: Mar 21 04:52:31 crc kubenswrapper[4580]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:50.731878727 +0000 UTC m=+15.814462355,LastTimestamp:2026-03-21 04:51:50.731878727 +0000 UTC m=+15.814462355,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:52:31 crc kubenswrapper[4580]: > Mar 21 04:52:31 crc 
kubenswrapper[4580]: E0321 04:52:31.243465 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec217ce4fb92c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:50.731921708 +0000 UTC m=+15.814505346,LastTimestamp:2026-03-21 04:51:50.731921708 +0000 UTC m=+15.814505346,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.247145 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 21 04:52:31 crc kubenswrapper[4580]: &Event{ObjectMeta:{kube-apiserver-crc.189ec217cefa32eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 21 04:52:31 crc kubenswrapper[4580]: body: [+]ping ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]log ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]etcd ok Mar 21 04:52:31 crc kubenswrapper[4580]: 
[+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/generic-apiserver-start-informers ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/priority-and-fairness-filter ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/start-apiextensions-informers ok Mar 21 04:52:31 crc kubenswrapper[4580]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 21 04:52:31 crc kubenswrapper[4580]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/start-system-namespaces-controller ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 21 04:52:31 crc kubenswrapper[4580]: [-]poststarthook/start-service-ip-repair-controllers failed: reason withheld Mar 21 04:52:31 crc kubenswrapper[4580]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 21 
04:52:31 crc kubenswrapper[4580]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 21 04:52:31 crc kubenswrapper[4580]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Mar 21 04:52:31 crc kubenswrapper[4580]: [-]poststarthook/bootstrap-controller failed: reason withheld Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/start-kube-aggregator-informers ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 21 04:52:31 crc kubenswrapper[4580]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 21 04:52:31 crc kubenswrapper[4580]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 21 04:52:31 crc kubenswrapper[4580]: [-]autoregister-completion failed: reason withheld Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/apiservice-openapi-controller ok Mar 21 04:52:31 crc kubenswrapper[4580]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 21 04:52:31 crc kubenswrapper[4580]: livez check failed Mar 21 04:52:31 crc kubenswrapper[4580]: Mar 21 04:52:31 crc kubenswrapper[4580]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:50.743093995 +0000 UTC m=+15.825677623,LastTimestamp:2026-03-21 04:51:50.743093995 +0000 UTC m=+15.825677623,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:52:31 crc kubenswrapper[4580]: > Mar 21 04:52:31 crc kubenswrapper[4580]: 
E0321 04:52:31.250380 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec217cefb02d0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:50.743147216 +0000 UTC m=+15.825730844,LastTimestamp:2026-03-21 04:51:50.743147216 +0000 UTC m=+15.825730844,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.255031 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ec21533d8e866\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec21533d8e866 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.550509158 +0000 UTC 
m=+4.633092786,LastTimestamp:2026-03-21 04:51:51.772577968 +0000 UTC m=+16.855161586,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.259212 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ec215406fafe5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec215406fafe5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.761717221 +0000 UTC m=+4.844300839,LastTimestamp:2026-03-21 04:51:51.951314602 +0000 UTC m=+17.033898230,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.264007 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ec2154246bfac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec2154246bfac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:39.792588716 +0000 UTC m=+4.875172344,LastTimestamp:2026-03-21 04:51:51.960830304 +0000 UTC m=+17.043413932,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.268326 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec21615c09502\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:52:31 crc kubenswrapper[4580]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec21615c09502 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 04:52:31 crc kubenswrapper[4580]: body: Mar 21 04:52:31 crc kubenswrapper[4580]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:43.340565762 +0000 UTC m=+8.423149420,LastTimestamp:2026-03-21 04:51:53.341440009 +0000 UTC m=+18.424023657,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:52:31 crc kubenswrapper[4580]: > Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.272296 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec21615c2669b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec21615c2669b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:43.340684955 +0000 UTC m=+8.423268643,LastTimestamp:2026-03-21 04:51:53.341524231 +0000 UTC m=+18.424107879,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.278253 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec21615c09502\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:52:31 crc kubenswrapper[4580]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec21615c09502 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 04:52:31 crc kubenswrapper[4580]: body: Mar 21 04:52:31 crc kubenswrapper[4580]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:43.340565762 +0000 UTC m=+8.423149420,LastTimestamp:2026-03-21 04:52:03.340615797 +0000 UTC m=+28.423199425,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:52:31 crc kubenswrapper[4580]: > Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.282484 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec21615c2669b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec21615c2669b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:43.340684955 +0000 UTC m=+8.423268643,LastTimestamp:2026-03-21 04:52:03.340662669 
+0000 UTC m=+28.423246297,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.287079 4580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec21abe2c27a2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:52:03.346057122 +0000 UTC m=+28.428640770,LastTimestamp:2026-03-21 04:52:03.346057122 +0000 UTC m=+28.428640770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.291575 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec214a28d7604\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec214a28d7604 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.112868356 +0000 UTC m=+2.195451994,LastTimestamp:2026-03-21 04:52:03.468323917 +0000 UTC m=+28.550907545,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.296905 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec214b9f16be1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec214b9f16be1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.505295329 +0000 UTC m=+2.587878987,LastTimestamp:2026-03-21 04:52:03.628317344 +0000 UTC m=+28.710900992,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.300765 4580 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189ec214bfae2ea3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec214bfae2ea3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:37.601552035 +0000 UTC m=+2.684135683,LastTimestamp:2026-03-21 04:52:03.637445786 +0000 UTC m=+28.720029414,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.306718 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec21615c09502\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:52:31 crc kubenswrapper[4580]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec21615c09502 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
21 04:52:31 crc kubenswrapper[4580]: body: Mar 21 04:52:31 crc kubenswrapper[4580]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:43.340565762 +0000 UTC m=+8.423149420,LastTimestamp:2026-03-21 04:52:13.341563443 +0000 UTC m=+38.424147071,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:52:31 crc kubenswrapper[4580]: > Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.310814 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec21615c2669b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec21615c2669b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:43.340684955 +0000 UTC m=+8.423268643,LastTimestamp:2026-03-21 04:52:13.341605734 +0000 UTC m=+38.424189362,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:31 crc kubenswrapper[4580]: E0321 04:52:31.315662 4580 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec21615c09502\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:52:31 crc kubenswrapper[4580]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec21615c09502 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 04:52:31 crc kubenswrapper[4580]: body: Mar 21 04:52:31 crc kubenswrapper[4580]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:51:43.340565762 +0000 UTC m=+8.423149420,LastTimestamp:2026-03-21 04:52:23.341125774 +0000 UTC m=+48.423709402,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:52:31 crc kubenswrapper[4580]: > Mar 21 04:52:31 crc kubenswrapper[4580]: I0321 04:52:31.531914 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:32 crc kubenswrapper[4580]: E0321 04:52:32.155165 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.158281 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:32 crc kubenswrapper[4580]: 
I0321 04:52:32.160010 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.160050 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.160066 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.160092 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:52:32 crc kubenswrapper[4580]: E0321 04:52:32.167336 4580 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.266380 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.266866 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.269594 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.269653 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.269674 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.271614 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:52:32 crc 
kubenswrapper[4580]: I0321 04:52:32.534125 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.932196 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.933185 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.933226 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:32 crc kubenswrapper[4580]: I0321 04:52:32.933236 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:33 crc kubenswrapper[4580]: I0321 04:52:33.533435 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:34 crc kubenswrapper[4580]: I0321 04:52:34.532579 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:35 crc kubenswrapper[4580]: I0321 04:52:35.532901 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:35 crc kubenswrapper[4580]: E0321 04:52:35.679490 4580 eviction_manager.go:285] "Eviction manager: failed to get 
summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:52:36 crc kubenswrapper[4580]: I0321 04:52:36.533943 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:37 crc kubenswrapper[4580]: I0321 04:52:37.533137 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:38 crc kubenswrapper[4580]: I0321 04:52:38.531579 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:38 crc kubenswrapper[4580]: I0321 04:52:38.617497 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:38 crc kubenswrapper[4580]: I0321 04:52:38.619106 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:38 crc kubenswrapper[4580]: I0321 04:52:38.619169 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:38 crc kubenswrapper[4580]: I0321 04:52:38.619183 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:38 crc kubenswrapper[4580]: I0321 04:52:38.620156 4580 scope.go:117] "RemoveContainer" containerID="65085743d7ee152a030b3a8764759c08e2b7178f0f1e3c65a908a8c84227b881" Mar 21 04:52:38 crc kubenswrapper[4580]: I0321 04:52:38.950542 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 04:52:38 crc kubenswrapper[4580]: I0321 04:52:38.951648 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4dac441b0b921acc01bd50e5e9a93d3f38fcdd018d7177f99a0b8764e536fb8b"} Mar 21 04:52:38 crc kubenswrapper[4580]: I0321 04:52:38.951770 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:38 crc kubenswrapper[4580]: I0321 04:52:38.952530 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:38 crc kubenswrapper[4580]: I0321 04:52:38.952556 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:38 crc kubenswrapper[4580]: I0321 04:52:38.952566 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:39 crc kubenswrapper[4580]: E0321 04:52:39.160793 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.167860 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.168911 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.168939 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:39 crc 
kubenswrapper[4580]: I0321 04:52:39.168951 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.168990 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:52:39 crc kubenswrapper[4580]: E0321 04:52:39.173280 4580 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.531505 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.958281 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.959675 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.962880 4580 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4dac441b0b921acc01bd50e5e9a93d3f38fcdd018d7177f99a0b8764e536fb8b" exitCode=255 Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.962930 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4dac441b0b921acc01bd50e5e9a93d3f38fcdd018d7177f99a0b8764e536fb8b"} Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.962979 4580 scope.go:117] 
"RemoveContainer" containerID="65085743d7ee152a030b3a8764759c08e2b7178f0f1e3c65a908a8c84227b881" Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.964574 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.965769 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.965835 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.965846 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:39 crc kubenswrapper[4580]: I0321 04:52:39.966639 4580 scope.go:117] "RemoveContainer" containerID="4dac441b0b921acc01bd50e5e9a93d3f38fcdd018d7177f99a0b8764e536fb8b" Mar 21 04:52:39 crc kubenswrapper[4580]: E0321 04:52:39.966940 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:52:40 crc kubenswrapper[4580]: I0321 04:52:40.530461 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:40 crc kubenswrapper[4580]: I0321 04:52:40.966580 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:52:41 crc 
kubenswrapper[4580]: I0321 04:52:41.530439 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:42 crc kubenswrapper[4580]: I0321 04:52:42.531521 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:43 crc kubenswrapper[4580]: I0321 04:52:43.532773 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:43 crc kubenswrapper[4580]: I0321 04:52:43.569542 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:43 crc kubenswrapper[4580]: I0321 04:52:43.571045 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:43 crc kubenswrapper[4580]: I0321 04:52:43.572600 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:43 crc kubenswrapper[4580]: I0321 04:52:43.572678 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:43 crc kubenswrapper[4580]: I0321 04:52:43.572692 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:43 crc kubenswrapper[4580]: I0321 04:52:43.574193 4580 scope.go:117] "RemoveContainer" containerID="4dac441b0b921acc01bd50e5e9a93d3f38fcdd018d7177f99a0b8764e536fb8b" Mar 21 04:52:43 crc kubenswrapper[4580]: E0321 
04:52:43.574404 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:52:43 crc kubenswrapper[4580]: W0321 04:52:43.816563 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 21 04:52:43 crc kubenswrapper[4580]: E0321 04:52:43.816643 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 21 04:52:44 crc kubenswrapper[4580]: I0321 04:52:44.531893 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:44 crc kubenswrapper[4580]: I0321 04:52:44.810334 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:44 crc kubenswrapper[4580]: I0321 04:52:44.810623 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:44 crc kubenswrapper[4580]: I0321 04:52:44.812456 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:44 crc kubenswrapper[4580]: 
I0321 04:52:44.812532 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:44 crc kubenswrapper[4580]: I0321 04:52:44.812570 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:44 crc kubenswrapper[4580]: I0321 04:52:44.813375 4580 scope.go:117] "RemoveContainer" containerID="4dac441b0b921acc01bd50e5e9a93d3f38fcdd018d7177f99a0b8764e536fb8b" Mar 21 04:52:44 crc kubenswrapper[4580]: E0321 04:52:44.813606 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:52:45 crc kubenswrapper[4580]: W0321 04:52:45.492356 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:45 crc kubenswrapper[4580]: E0321 04:52:45.492439 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 21 04:52:45 crc kubenswrapper[4580]: I0321 04:52:45.533374 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:45 crc kubenswrapper[4580]: E0321 
04:52:45.680255 4580 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:52:46 crc kubenswrapper[4580]: E0321 04:52:46.167270 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 04:52:46 crc kubenswrapper[4580]: I0321 04:52:46.174103 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:46 crc kubenswrapper[4580]: I0321 04:52:46.175841 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:46 crc kubenswrapper[4580]: I0321 04:52:46.175963 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:46 crc kubenswrapper[4580]: I0321 04:52:46.176045 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:46 crc kubenswrapper[4580]: I0321 04:52:46.176164 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:52:46 crc kubenswrapper[4580]: E0321 04:52:46.181378 4580 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 04:52:46 crc kubenswrapper[4580]: I0321 04:52:46.531512 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:46 crc kubenswrapper[4580]: I0321 04:52:46.616827 4580 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 21 04:52:46 crc kubenswrapper[4580]: I0321 04:52:46.618006 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:46 crc kubenswrapper[4580]: I0321 04:52:46.618045 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:46 crc kubenswrapper[4580]: I0321 04:52:46.618057 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:47 crc kubenswrapper[4580]: I0321 04:52:47.532923 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:47 crc kubenswrapper[4580]: I0321 04:52:47.963506 4580 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 04:52:47 crc kubenswrapper[4580]: I0321 04:52:47.978317 4580 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 21 04:52:48 crc kubenswrapper[4580]: I0321 04:52:48.534081 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:49 crc kubenswrapper[4580]: W0321 04:52:49.386898 4580 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 21 04:52:49 crc kubenswrapper[4580]: E0321 04:52:49.386968 4580 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes 
\"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 21 04:52:49 crc kubenswrapper[4580]: I0321 04:52:49.534446 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:50 crc kubenswrapper[4580]: I0321 04:52:50.531615 4580 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:52:50 crc kubenswrapper[4580]: I0321 04:52:50.550661 4580 csr.go:261] certificate signing request csr-k55rf is approved, waiting to be issued Mar 21 04:52:50 crc kubenswrapper[4580]: I0321 04:52:50.560214 4580 csr.go:257] certificate signing request csr-k55rf is issued Mar 21 04:52:50 crc kubenswrapper[4580]: I0321 04:52:50.667175 4580 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 21 04:52:51 crc kubenswrapper[4580]: I0321 04:52:51.388770 4580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 21 04:52:51 crc kubenswrapper[4580]: I0321 04:52:51.562278 4580 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-05 12:55:36.25278045 +0000 UTC Mar 21 04:52:51 crc kubenswrapper[4580]: I0321 04:52:51.562355 4580 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6968h2m44.690432901s for next certificate rotation Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.182297 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:53 crc 
kubenswrapper[4580]: I0321 04:52:53.184771 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.184838 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.184855 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.184999 4580 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.196070 4580 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.196406 4580 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 21 04:52:53 crc kubenswrapper[4580]: E0321 04:52:53.196436 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.201968 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.202216 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.202302 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.202412 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.202532 4580 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:52:53Z","lastTransitionTime":"2026-03-21T04:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:52:53 crc kubenswrapper[4580]: E0321 04:52:53.228297 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.237363 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.237847 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.237923 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.238009 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.238074 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:52:53Z","lastTransitionTime":"2026-03-21T04:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:52:53 crc kubenswrapper[4580]: E0321 04:52:53.253402 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.264431 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.264491 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.264503 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.264526 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.264543 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:52:53Z","lastTransitionTime":"2026-03-21T04:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:52:53 crc kubenswrapper[4580]: E0321 04:52:53.276879 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.285767 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.285849 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.285862 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.285889 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:52:53 crc kubenswrapper[4580]: I0321 04:52:53.285903 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:52:53Z","lastTransitionTime":"2026-03-21T04:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:52:53 crc kubenswrapper[4580]: E0321 04:52:53.299721 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4580]: E0321 04:52:53.299901 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:52:53 crc kubenswrapper[4580]: E0321 04:52:53.299965 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:53 crc kubenswrapper[4580]: E0321 04:52:53.400495 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:53 crc kubenswrapper[4580]: E0321 04:52:53.500840 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:53 crc kubenswrapper[4580]: E0321 04:52:53.601943 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:53 crc kubenswrapper[4580]: E0321 04:52:53.702950 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:53 crc kubenswrapper[4580]: E0321 04:52:53.803713 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:53 crc kubenswrapper[4580]: E0321 04:52:53.904325 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:54 crc kubenswrapper[4580]: E0321 04:52:54.005078 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:54 crc kubenswrapper[4580]: E0321 04:52:54.105264 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:54 crc kubenswrapper[4580]: E0321 04:52:54.206647 4580 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:54 crc kubenswrapper[4580]: E0321 04:52:54.306840 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:54 crc kubenswrapper[4580]: E0321 04:52:54.407980 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:54 crc kubenswrapper[4580]: E0321 04:52:54.508964 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:54 crc kubenswrapper[4580]: E0321 04:52:54.609488 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:54 crc kubenswrapper[4580]: E0321 04:52:54.710187 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:54 crc kubenswrapper[4580]: E0321 04:52:54.810345 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:54 crc kubenswrapper[4580]: E0321 04:52:54.910731 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:55 crc kubenswrapper[4580]: E0321 04:52:55.010909 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:55 crc kubenswrapper[4580]: E0321 04:52:55.111811 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:55 crc kubenswrapper[4580]: E0321 04:52:55.212523 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:55 crc kubenswrapper[4580]: E0321 04:52:55.312927 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:55 crc 
kubenswrapper[4580]: E0321 04:52:55.413538 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:55 crc kubenswrapper[4580]: E0321 04:52:55.514327 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:55 crc kubenswrapper[4580]: E0321 04:52:55.614647 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:55 crc kubenswrapper[4580]: E0321 04:52:55.680542 4580 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:52:55 crc kubenswrapper[4580]: E0321 04:52:55.714981 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:55 crc kubenswrapper[4580]: E0321 04:52:55.815929 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:55 crc kubenswrapper[4580]: E0321 04:52:55.916307 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:56 crc kubenswrapper[4580]: E0321 04:52:56.016681 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:56 crc kubenswrapper[4580]: E0321 04:52:56.117870 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:56 crc kubenswrapper[4580]: E0321 04:52:56.219084 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:56 crc kubenswrapper[4580]: E0321 04:52:56.319987 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:56 crc kubenswrapper[4580]: E0321 04:52:56.420187 4580 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 21 04:52:56 crc kubenswrapper[4580]: E0321 04:52:56.520898 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:56 crc kubenswrapper[4580]: E0321 04:52:56.621308 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:56 crc kubenswrapper[4580]: E0321 04:52:56.722472 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:56 crc kubenswrapper[4580]: E0321 04:52:56.823592 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:56 crc kubenswrapper[4580]: E0321 04:52:56.924611 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:57 crc kubenswrapper[4580]: E0321 04:52:57.025336 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:57 crc kubenswrapper[4580]: E0321 04:52:57.125843 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:57 crc kubenswrapper[4580]: E0321 04:52:57.226979 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:57 crc kubenswrapper[4580]: E0321 04:52:57.328195 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:57 crc kubenswrapper[4580]: E0321 04:52:57.428898 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:57 crc kubenswrapper[4580]: E0321 04:52:57.529810 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:57 crc kubenswrapper[4580]: E0321 04:52:57.630952 4580 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:57 crc kubenswrapper[4580]: E0321 04:52:57.731158 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:57 crc kubenswrapper[4580]: E0321 04:52:57.831303 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:57 crc kubenswrapper[4580]: E0321 04:52:57.931566 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:58 crc kubenswrapper[4580]: E0321 04:52:58.032358 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:58 crc kubenswrapper[4580]: E0321 04:52:58.133367 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:58 crc kubenswrapper[4580]: E0321 04:52:58.233496 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:58 crc kubenswrapper[4580]: E0321 04:52:58.334650 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:58 crc kubenswrapper[4580]: E0321 04:52:58.438504 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:58 crc kubenswrapper[4580]: E0321 04:52:58.539139 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:58 crc kubenswrapper[4580]: E0321 04:52:58.640230 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:58 crc kubenswrapper[4580]: E0321 04:52:58.741245 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:58 crc 
kubenswrapper[4580]: E0321 04:52:58.841464 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:58 crc kubenswrapper[4580]: E0321 04:52:58.942228 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:59 crc kubenswrapper[4580]: E0321 04:52:59.042948 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:59 crc kubenswrapper[4580]: E0321 04:52:59.143150 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:59 crc kubenswrapper[4580]: E0321 04:52:59.244105 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:59 crc kubenswrapper[4580]: E0321 04:52:59.344258 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:59 crc kubenswrapper[4580]: E0321 04:52:59.445389 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:59 crc kubenswrapper[4580]: E0321 04:52:59.546035 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:59 crc kubenswrapper[4580]: I0321 04:52:59.617375 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:52:59 crc kubenswrapper[4580]: I0321 04:52:59.619371 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:52:59 crc kubenswrapper[4580]: I0321 04:52:59.619427 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:52:59 crc kubenswrapper[4580]: I0321 04:52:59.619444 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 21 04:52:59 crc kubenswrapper[4580]: I0321 04:52:59.620265 4580 scope.go:117] "RemoveContainer" containerID="4dac441b0b921acc01bd50e5e9a93d3f38fcdd018d7177f99a0b8764e536fb8b" Mar 21 04:52:59 crc kubenswrapper[4580]: E0321 04:52:59.620453 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:52:59 crc kubenswrapper[4580]: E0321 04:52:59.646915 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:59 crc kubenswrapper[4580]: E0321 04:52:59.748030 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:59 crc kubenswrapper[4580]: E0321 04:52:59.848375 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:52:59 crc kubenswrapper[4580]: E0321 04:52:59.949268 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:00 crc kubenswrapper[4580]: E0321 04:53:00.050366 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:00 crc kubenswrapper[4580]: E0321 04:53:00.151011 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:00 crc kubenswrapper[4580]: E0321 04:53:00.251987 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:00 crc kubenswrapper[4580]: E0321 04:53:00.352173 4580 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 21 04:53:00 crc kubenswrapper[4580]: E0321 04:53:00.452720 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:00 crc kubenswrapper[4580]: E0321 04:53:00.553883 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:00 crc kubenswrapper[4580]: E0321 04:53:00.654110 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:00 crc kubenswrapper[4580]: E0321 04:53:00.755189 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:00 crc kubenswrapper[4580]: E0321 04:53:00.855714 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:00 crc kubenswrapper[4580]: E0321 04:53:00.956467 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:01 crc kubenswrapper[4580]: E0321 04:53:01.057163 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:01 crc kubenswrapper[4580]: E0321 04:53:01.157580 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:01 crc kubenswrapper[4580]: E0321 04:53:01.258033 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:01 crc kubenswrapper[4580]: E0321 04:53:01.359006 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:01 crc kubenswrapper[4580]: E0321 04:53:01.459568 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:01 crc kubenswrapper[4580]: E0321 04:53:01.560181 4580 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:01 crc kubenswrapper[4580]: E0321 04:53:01.660887 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:01 crc kubenswrapper[4580]: E0321 04:53:01.761760 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:01 crc kubenswrapper[4580]: E0321 04:53:01.862653 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:01 crc kubenswrapper[4580]: E0321 04:53:01.963682 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:02 crc kubenswrapper[4580]: E0321 04:53:02.064735 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:02 crc kubenswrapper[4580]: E0321 04:53:02.165158 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:02 crc kubenswrapper[4580]: E0321 04:53:02.266370 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:02 crc kubenswrapper[4580]: E0321 04:53:02.367233 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:02 crc kubenswrapper[4580]: E0321 04:53:02.468400 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:02 crc kubenswrapper[4580]: E0321 04:53:02.569008 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:02 crc kubenswrapper[4580]: E0321 04:53:02.669539 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:02 crc 
kubenswrapper[4580]: E0321 04:53:02.770603 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:02 crc kubenswrapper[4580]: E0321 04:53:02.871682 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:02 crc kubenswrapper[4580]: E0321 04:53:02.972254 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.073167 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.173908 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.186155 4580 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.274764 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.375653 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.476000 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.526928 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.531978 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.532027 4580 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.532037 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.532054 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.532069 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:03Z","lastTransitionTime":"2026-03-21T04:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.541404 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.545894 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.545939 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.545953 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.545980 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 
04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.545995 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:03Z","lastTransitionTime":"2026-03-21T04:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.558929 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.563967 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.564034 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.564045 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.564065 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.564076 4580 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:03Z","lastTransitionTime":"2026-03-21T04:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.574918 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.579515 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.579564 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.579577 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.579604 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:03 crc kubenswrapper[4580]: I0321 04:53:03.579621 4580 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:03Z","lastTransitionTime":"2026-03-21T04:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.592681 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.592849 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.592889 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.693543 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.794545 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.895653 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:03 crc kubenswrapper[4580]: E0321 04:53:03.996332 4580 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found"
Mar 21 04:53:04 crc kubenswrapper[4580]: E0321 04:53:04.096802 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:04 crc kubenswrapper[4580]: E0321 04:53:04.197178 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:04 crc kubenswrapper[4580]: E0321 04:53:04.297626 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:04 crc kubenswrapper[4580]: E0321 04:53:04.398767 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:04 crc kubenswrapper[4580]: E0321 04:53:04.499091 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:04 crc kubenswrapper[4580]: E0321 04:53:04.600118 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:04 crc kubenswrapper[4580]: E0321 04:53:04.700755 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:04 crc kubenswrapper[4580]: E0321 04:53:04.801772 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:04 crc kubenswrapper[4580]: E0321 04:53:04.902891 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:05 crc kubenswrapper[4580]: E0321 04:53:05.003276 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:05 crc kubenswrapper[4580]: E0321 04:53:05.103647 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:05 crc kubenswrapper[4580]: E0321 04:53:05.204746 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:05 crc kubenswrapper[4580]: E0321 04:53:05.305836 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:05 crc kubenswrapper[4580]: E0321 04:53:05.406262 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:05 crc kubenswrapper[4580]: E0321 04:53:05.506872 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:05 crc kubenswrapper[4580]: E0321 04:53:05.607568 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:05 crc kubenswrapper[4580]: E0321 04:53:05.680949 4580 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 21 04:53:05 crc kubenswrapper[4580]: E0321 04:53:05.708386 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:05 crc kubenswrapper[4580]: E0321 04:53:05.808930 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:05 crc kubenswrapper[4580]: E0321 04:53:05.909118 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:06 crc kubenswrapper[4580]: E0321 04:53:06.009487 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:06 crc kubenswrapper[4580]: E0321 04:53:06.109839 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:06 crc kubenswrapper[4580]: E0321 04:53:06.210830 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:06 crc kubenswrapper[4580]: E0321 04:53:06.311541 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:06 crc kubenswrapper[4580]: E0321 04:53:06.412412 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:06 crc kubenswrapper[4580]: E0321 04:53:06.512819 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:06 crc kubenswrapper[4580]: E0321 04:53:06.613035 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:06 crc kubenswrapper[4580]: E0321 04:53:06.713471 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:06 crc kubenswrapper[4580]: E0321 04:53:06.813582 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:06 crc kubenswrapper[4580]: E0321 04:53:06.914689 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:07 crc kubenswrapper[4580]: E0321 04:53:07.014865 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:07 crc kubenswrapper[4580]: E0321 04:53:07.115409 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:07 crc kubenswrapper[4580]: E0321 04:53:07.216429 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:07 crc kubenswrapper[4580]: E0321 04:53:07.317472 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:07 crc kubenswrapper[4580]: E0321 04:53:07.418224 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:07 crc kubenswrapper[4580]: E0321 04:53:07.518792 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:07 crc kubenswrapper[4580]: E0321 04:53:07.618982 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:07 crc kubenswrapper[4580]: E0321 04:53:07.719673 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:07 crc kubenswrapper[4580]: E0321 04:53:07.820692 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:07 crc kubenswrapper[4580]: E0321 04:53:07.922087 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:08 crc kubenswrapper[4580]: E0321 04:53:08.022373 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:08 crc kubenswrapper[4580]: E0321 04:53:08.122548 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:08 crc kubenswrapper[4580]: E0321 04:53:08.223273 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:08 crc kubenswrapper[4580]: E0321 04:53:08.324028 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:08 crc kubenswrapper[4580]: E0321 04:53:08.425118 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:08 crc kubenswrapper[4580]: E0321 04:53:08.526116 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:08 crc kubenswrapper[4580]: I0321 04:53:08.617618 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:53:08 crc kubenswrapper[4580]: I0321 04:53:08.618940 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:53:08 crc kubenswrapper[4580]: I0321 04:53:08.619061 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:53:08 crc kubenswrapper[4580]: I0321 04:53:08.619142 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:53:08 crc kubenswrapper[4580]: E0321 04:53:08.626744 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:08 crc kubenswrapper[4580]: E0321 04:53:08.727531 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:08 crc kubenswrapper[4580]: E0321 04:53:08.828454 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:08 crc kubenswrapper[4580]: E0321 04:53:08.928859 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:09 crc kubenswrapper[4580]: E0321 04:53:09.029709 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:09 crc kubenswrapper[4580]: E0321 04:53:09.130162 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:09 crc kubenswrapper[4580]: E0321 04:53:09.230989 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:09 crc kubenswrapper[4580]: E0321 04:53:09.331947 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:09 crc kubenswrapper[4580]: E0321 04:53:09.432590 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:09 crc kubenswrapper[4580]: E0321 04:53:09.533581 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:09 crc kubenswrapper[4580]: E0321 04:53:09.633719 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:09 crc kubenswrapper[4580]: E0321 04:53:09.734192 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:09 crc kubenswrapper[4580]: E0321 04:53:09.834672 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:09 crc kubenswrapper[4580]: E0321 04:53:09.935832 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:10 crc kubenswrapper[4580]: E0321 04:53:10.035997 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:10 crc kubenswrapper[4580]: E0321 04:53:10.137003 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:10 crc kubenswrapper[4580]: E0321 04:53:10.237607 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:10 crc kubenswrapper[4580]: E0321 04:53:10.338650 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:10 crc kubenswrapper[4580]: E0321 04:53:10.439850 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:10 crc kubenswrapper[4580]: E0321 04:53:10.540920 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:10 crc kubenswrapper[4580]: E0321 04:53:10.641224 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:10 crc kubenswrapper[4580]: E0321 04:53:10.741767 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:10 crc kubenswrapper[4580]: E0321 04:53:10.842565 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:10 crc kubenswrapper[4580]: E0321 04:53:10.942724 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:11 crc kubenswrapper[4580]: E0321 04:53:11.043687 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:11 crc kubenswrapper[4580]: E0321 04:53:11.144242 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:11 crc kubenswrapper[4580]: E0321 04:53:11.244627 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:11 crc kubenswrapper[4580]: E0321 04:53:11.344972 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:11 crc kubenswrapper[4580]: E0321 04:53:11.445434 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:11 crc kubenswrapper[4580]: E0321 04:53:11.546215 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:11 crc kubenswrapper[4580]: E0321 04:53:11.646918 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:11 crc kubenswrapper[4580]: E0321 04:53:11.747737 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:11 crc kubenswrapper[4580]: E0321 04:53:11.848610 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:11 crc kubenswrapper[4580]: E0321 04:53:11.949260 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:12 crc kubenswrapper[4580]: E0321 04:53:12.049612 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:12 crc kubenswrapper[4580]: E0321 04:53:12.150475 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:12 crc kubenswrapper[4580]: E0321 04:53:12.251569 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:12 crc kubenswrapper[4580]: E0321 04:53:12.351890 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:12 crc kubenswrapper[4580]: E0321 04:53:12.452641 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:12 crc kubenswrapper[4580]: E0321 04:53:12.553858 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:12 crc kubenswrapper[4580]: E0321 04:53:12.654514 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:12 crc kubenswrapper[4580]: E0321 04:53:12.755371 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:12 crc kubenswrapper[4580]: E0321 04:53:12.856211 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:12 crc kubenswrapper[4580]: E0321 04:53:12.957152 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.057271 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.157824 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.258631 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.358834 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.459441 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.559976 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.629190 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.633442 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.633479 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.633489 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.633505 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeNotReady" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.633517 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:13Z","lastTransitionTime":"2026-03-21T04:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.647379 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.651837 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.651901 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.651925 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.651955 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.651982 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:13Z","lastTransitionTime":"2026-03-21T04:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.665237 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.668714 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.668749 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.668759 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.668772 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.668794 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:13Z","lastTransitionTime":"2026-03-21T04:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.678640 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.682577 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.682643 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.682655 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.682671 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:13 crc kubenswrapper[4580]: I0321 04:53:13.682680 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:13Z","lastTransitionTime":"2026-03-21T04:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.693363 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.693581 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.693619 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.794703 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.894881 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:13 crc kubenswrapper[4580]: E0321 04:53:13.995475 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:14 crc kubenswrapper[4580]: E0321 04:53:14.096321 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:14 crc kubenswrapper[4580]: E0321 04:53:14.197110 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:14 crc kubenswrapper[4580]: E0321 04:53:14.297922 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:14 crc kubenswrapper[4580]: E0321 04:53:14.399046 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:14 crc kubenswrapper[4580]: E0321 04:53:14.500187 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:14 crc kubenswrapper[4580]: E0321 04:53:14.601186 4580 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:14 crc kubenswrapper[4580]: I0321 04:53:14.617771 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:53:14 crc kubenswrapper[4580]: I0321 04:53:14.618936 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:14 crc kubenswrapper[4580]: I0321 04:53:14.618963 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:14 crc kubenswrapper[4580]: I0321 04:53:14.618971 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:14 crc kubenswrapper[4580]: I0321 04:53:14.619521 4580 scope.go:117] "RemoveContainer" containerID="4dac441b0b921acc01bd50e5e9a93d3f38fcdd018d7177f99a0b8764e536fb8b" Mar 21 04:53:14 crc kubenswrapper[4580]: E0321 04:53:14.619661 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:53:14 crc kubenswrapper[4580]: E0321 04:53:14.701837 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:14 crc kubenswrapper[4580]: E0321 04:53:14.802867 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:14 crc kubenswrapper[4580]: E0321 04:53:14.903489 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:15 crc kubenswrapper[4580]: E0321 
04:53:15.004485 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:15 crc kubenswrapper[4580]: E0321 04:53:15.105034 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:15 crc kubenswrapper[4580]: E0321 04:53:15.206163 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:15 crc kubenswrapper[4580]: E0321 04:53:15.307128 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:15 crc kubenswrapper[4580]: E0321 04:53:15.407368 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:15 crc kubenswrapper[4580]: E0321 04:53:15.508492 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:15 crc kubenswrapper[4580]: E0321 04:53:15.609448 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:15 crc kubenswrapper[4580]: E0321 04:53:15.681891 4580 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:53:15 crc kubenswrapper[4580]: E0321 04:53:15.710574 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:15 crc kubenswrapper[4580]: E0321 04:53:15.811687 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:15 crc kubenswrapper[4580]: E0321 04:53:15.912065 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:16 crc kubenswrapper[4580]: E0321 04:53:16.012637 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 21 04:53:16 crc kubenswrapper[4580]: E0321 04:53:16.113815 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:16 crc kubenswrapper[4580]: E0321 04:53:16.214963 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:16 crc kubenswrapper[4580]: E0321 04:53:16.316623 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:16 crc kubenswrapper[4580]: E0321 04:53:16.418119 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:16 crc kubenswrapper[4580]: E0321 04:53:16.518306 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:16 crc kubenswrapper[4580]: E0321 04:53:16.619475 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:16 crc kubenswrapper[4580]: E0321 04:53:16.720665 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:16 crc kubenswrapper[4580]: E0321 04:53:16.820970 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:16 crc kubenswrapper[4580]: E0321 04:53:16.921661 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:17 crc kubenswrapper[4580]: E0321 04:53:17.022871 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:17 crc kubenswrapper[4580]: E0321 04:53:17.123488 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:17 crc kubenswrapper[4580]: E0321 04:53:17.224487 4580 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Mar 21 04:53:17 crc kubenswrapper[4580]: E0321 04:53:17.325539 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:17 crc kubenswrapper[4580]: E0321 04:53:17.425875 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:17 crc kubenswrapper[4580]: E0321 04:53:17.526338 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:17 crc kubenswrapper[4580]: E0321 04:53:17.627286 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:17 crc kubenswrapper[4580]: E0321 04:53:17.728164 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:17 crc kubenswrapper[4580]: E0321 04:53:17.828933 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:17 crc kubenswrapper[4580]: E0321 04:53:17.929216 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:18 crc kubenswrapper[4580]: E0321 04:53:18.030468 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:18 crc kubenswrapper[4580]: E0321 04:53:18.131252 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:18 crc kubenswrapper[4580]: E0321 04:53:18.232086 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:18 crc kubenswrapper[4580]: E0321 04:53:18.332856 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:18 crc kubenswrapper[4580]: E0321 04:53:18.433266 4580 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:18 crc kubenswrapper[4580]: E0321 04:53:18.534296 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:18 crc kubenswrapper[4580]: E0321 04:53:18.634950 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:18 crc kubenswrapper[4580]: E0321 04:53:18.735904 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:18 crc kubenswrapper[4580]: E0321 04:53:18.836918 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:18 crc kubenswrapper[4580]: E0321 04:53:18.937201 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:19 crc kubenswrapper[4580]: E0321 04:53:19.037400 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:19 crc kubenswrapper[4580]: E0321 04:53:19.137909 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:19 crc kubenswrapper[4580]: E0321 04:53:19.238877 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:19 crc kubenswrapper[4580]: E0321 04:53:19.340642 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:19 crc kubenswrapper[4580]: E0321 04:53:19.441469 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:19 crc kubenswrapper[4580]: E0321 04:53:19.542300 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:19 crc 
kubenswrapper[4580]: E0321 04:53:19.642495 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:19 crc kubenswrapper[4580]: I0321 04:53:19.660142 4580 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 21 04:53:19 crc kubenswrapper[4580]: E0321 04:53:19.743287 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:19 crc kubenswrapper[4580]: E0321 04:53:19.843609 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:19 crc kubenswrapper[4580]: E0321 04:53:19.943862 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:20 crc kubenswrapper[4580]: E0321 04:53:20.044409 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:20 crc kubenswrapper[4580]: E0321 04:53:20.145418 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:20 crc kubenswrapper[4580]: E0321 04:53:20.245513 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:20 crc kubenswrapper[4580]: E0321 04:53:20.346536 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:20 crc kubenswrapper[4580]: E0321 04:53:20.447666 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:20 crc kubenswrapper[4580]: E0321 04:53:20.548064 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:20 crc kubenswrapper[4580]: E0321 04:53:20.649134 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 21 04:53:20 crc kubenswrapper[4580]: E0321 04:53:20.749950 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:20 crc kubenswrapper[4580]: E0321 04:53:20.851040 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:20 crc kubenswrapper[4580]: E0321 04:53:20.951239 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:21 crc kubenswrapper[4580]: E0321 04:53:21.051728 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:21 crc kubenswrapper[4580]: E0321 04:53:21.152348 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:21 crc kubenswrapper[4580]: E0321 04:53:21.253212 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:21 crc kubenswrapper[4580]: E0321 04:53:21.354126 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:21 crc kubenswrapper[4580]: E0321 04:53:21.454368 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:21 crc kubenswrapper[4580]: E0321 04:53:21.555179 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:21 crc kubenswrapper[4580]: E0321 04:53:21.656075 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:21 crc kubenswrapper[4580]: E0321 04:53:21.756795 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:21 crc kubenswrapper[4580]: E0321 04:53:21.857857 4580 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Mar 21 04:53:21 crc kubenswrapper[4580]: E0321 04:53:21.958275 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:22 crc kubenswrapper[4580]: E0321 04:53:22.058525 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:22 crc kubenswrapper[4580]: E0321 04:53:22.159594 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:22 crc kubenswrapper[4580]: E0321 04:53:22.260503 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:22 crc kubenswrapper[4580]: E0321 04:53:22.361396 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:22 crc kubenswrapper[4580]: E0321 04:53:22.461912 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:22 crc kubenswrapper[4580]: E0321 04:53:22.563061 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:22 crc kubenswrapper[4580]: E0321 04:53:22.664024 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:22 crc kubenswrapper[4580]: E0321 04:53:22.764960 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:22 crc kubenswrapper[4580]: E0321 04:53:22.865988 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:22 crc kubenswrapper[4580]: E0321 04:53:22.966482 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:23 crc kubenswrapper[4580]: E0321 04:53:23.067409 4580 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:23 crc kubenswrapper[4580]: E0321 04:53:23.168166 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:23 crc kubenswrapper[4580]: E0321 04:53:23.268710 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:23 crc kubenswrapper[4580]: E0321 04:53:23.369170 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:23 crc kubenswrapper[4580]: E0321 04:53:23.470015 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:23 crc kubenswrapper[4580]: E0321 04:53:23.570298 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:23 crc kubenswrapper[4580]: E0321 04:53:23.670950 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:23 crc kubenswrapper[4580]: E0321 04:53:23.771581 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:23 crc kubenswrapper[4580]: E0321 04:53:23.872697 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:23 crc kubenswrapper[4580]: E0321 04:53:23.973606 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.013140 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.018618 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.018983 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.019184 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.019399 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.019612 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:24Z","lastTransitionTime":"2026-03-21T04:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.035135 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.038829 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.038863 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.038878 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.038899 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.038913 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:24Z","lastTransitionTime":"2026-03-21T04:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.050945 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.055411 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.055454 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.055467 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.055486 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.055503 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:24Z","lastTransitionTime":"2026-03-21T04:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.068157 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.078066 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.078118 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.078133 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.078164 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:24 crc kubenswrapper[4580]: I0321 04:53:24.078179 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:24Z","lastTransitionTime":"2026-03-21T04:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.091809 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.091991 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.092032 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.192669 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.293395 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.394102 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.494303 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.595226 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.695689 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.796221 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.897027 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:24 crc kubenswrapper[4580]: E0321 04:53:24.997560 4580 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:25 crc kubenswrapper[4580]: E0321 04:53:25.098686 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:25 crc kubenswrapper[4580]: E0321 04:53:25.199661 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:25 crc kubenswrapper[4580]: E0321 04:53:25.300224 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:25 crc kubenswrapper[4580]: E0321 04:53:25.401085 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:25 crc kubenswrapper[4580]: E0321 04:53:25.501564 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:25 crc kubenswrapper[4580]: E0321 04:53:25.601983 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:25 crc kubenswrapper[4580]: I0321 04:53:25.617715 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:53:25 crc kubenswrapper[4580]: I0321 04:53:25.618745 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:25 crc kubenswrapper[4580]: I0321 04:53:25.618805 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:25 crc kubenswrapper[4580]: I0321 04:53:25.618821 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:25 crc kubenswrapper[4580]: I0321 04:53:25.619462 4580 scope.go:117] "RemoveContainer" containerID="4dac441b0b921acc01bd50e5e9a93d3f38fcdd018d7177f99a0b8764e536fb8b" Mar 21 04:53:25 
crc kubenswrapper[4580]: E0321 04:53:25.682040 4580 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:53:25 crc kubenswrapper[4580]: E0321 04:53:25.702914 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:25 crc kubenswrapper[4580]: E0321 04:53:25.803572 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:25 crc kubenswrapper[4580]: E0321 04:53:25.904257 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:26 crc kubenswrapper[4580]: E0321 04:53:26.005437 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:26 crc kubenswrapper[4580]: E0321 04:53:26.105628 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:26 crc kubenswrapper[4580]: I0321 04:53:26.107162 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:53:26 crc kubenswrapper[4580]: I0321 04:53:26.109390 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba"} Mar 21 04:53:26 crc kubenswrapper[4580]: I0321 04:53:26.109573 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:53:26 crc kubenswrapper[4580]: I0321 04:53:26.110632 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:26 crc kubenswrapper[4580]: I0321 04:53:26.110659 4580 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:26 crc kubenswrapper[4580]: I0321 04:53:26.110668 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:26 crc kubenswrapper[4580]: E0321 04:53:26.206348 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:26 crc kubenswrapper[4580]: E0321 04:53:26.306835 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:26 crc kubenswrapper[4580]: E0321 04:53:26.407341 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:26 crc kubenswrapper[4580]: E0321 04:53:26.508354 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:26 crc kubenswrapper[4580]: E0321 04:53:26.608549 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:26 crc kubenswrapper[4580]: E0321 04:53:26.709372 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:26 crc kubenswrapper[4580]: E0321 04:53:26.810039 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:26 crc kubenswrapper[4580]: E0321 04:53:26.910670 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:27 crc kubenswrapper[4580]: E0321 04:53:27.011460 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:27 crc kubenswrapper[4580]: E0321 04:53:27.111668 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:27 crc 
kubenswrapper[4580]: I0321 04:53:27.114523 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 21 04:53:27 crc kubenswrapper[4580]: I0321 04:53:27.115147 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:53:27 crc kubenswrapper[4580]: I0321 04:53:27.117070 4580 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba" exitCode=255 Mar 21 04:53:27 crc kubenswrapper[4580]: I0321 04:53:27.117118 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba"} Mar 21 04:53:27 crc kubenswrapper[4580]: I0321 04:53:27.117204 4580 scope.go:117] "RemoveContainer" containerID="4dac441b0b921acc01bd50e5e9a93d3f38fcdd018d7177f99a0b8764e536fb8b" Mar 21 04:53:27 crc kubenswrapper[4580]: I0321 04:53:27.117353 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:53:27 crc kubenswrapper[4580]: I0321 04:53:27.118382 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:27 crc kubenswrapper[4580]: I0321 04:53:27.118447 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:27 crc kubenswrapper[4580]: I0321 04:53:27.118465 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:27 crc kubenswrapper[4580]: I0321 04:53:27.119284 4580 scope.go:117] "RemoveContainer" 
containerID="593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba" Mar 21 04:53:27 crc kubenswrapper[4580]: E0321 04:53:27.119515 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:53:27 crc kubenswrapper[4580]: E0321 04:53:27.211827 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:27 crc kubenswrapper[4580]: E0321 04:53:27.312411 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:27 crc kubenswrapper[4580]: E0321 04:53:27.412980 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:27 crc kubenswrapper[4580]: E0321 04:53:27.513898 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:27 crc kubenswrapper[4580]: E0321 04:53:27.614963 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:27 crc kubenswrapper[4580]: E0321 04:53:27.715893 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:27 crc kubenswrapper[4580]: E0321 04:53:27.816827 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:27 crc kubenswrapper[4580]: E0321 04:53:27.918054 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:28 crc kubenswrapper[4580]: E0321 04:53:28.018934 4580 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:28 crc kubenswrapper[4580]: E0321 04:53:28.119213 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:28 crc kubenswrapper[4580]: I0321 04:53:28.122103 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 21 04:53:28 crc kubenswrapper[4580]: E0321 04:53:28.219842 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:28 crc kubenswrapper[4580]: E0321 04:53:28.320274 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:28 crc kubenswrapper[4580]: E0321 04:53:28.421131 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:28 crc kubenswrapper[4580]: E0321 04:53:28.522201 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:28 crc kubenswrapper[4580]: E0321 04:53:28.622383 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:28 crc kubenswrapper[4580]: E0321 04:53:28.723092 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:28 crc kubenswrapper[4580]: E0321 04:53:28.823670 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:28 crc kubenswrapper[4580]: E0321 04:53:28.924884 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:29 crc kubenswrapper[4580]: E0321 04:53:29.026037 4580 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 21 04:53:29 crc kubenswrapper[4580]: E0321 04:53:29.126243 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:29 crc kubenswrapper[4580]: E0321 04:53:29.227179 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:29 crc kubenswrapper[4580]: E0321 04:53:29.327878 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:29 crc kubenswrapper[4580]: E0321 04:53:29.428960 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:29 crc kubenswrapper[4580]: E0321 04:53:29.529809 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:29 crc kubenswrapper[4580]: E0321 04:53:29.630524 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:29 crc kubenswrapper[4580]: E0321 04:53:29.730892 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:29 crc kubenswrapper[4580]: E0321 04:53:29.831515 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:29 crc kubenswrapper[4580]: E0321 04:53:29.931859 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:30 crc kubenswrapper[4580]: E0321 04:53:30.032587 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:30 crc kubenswrapper[4580]: E0321 04:53:30.132945 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:30 crc kubenswrapper[4580]: E0321 04:53:30.248700 4580 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:30 crc kubenswrapper[4580]: E0321 04:53:30.348840 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:30 crc kubenswrapper[4580]: E0321 04:53:30.449502 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:30 crc kubenswrapper[4580]: E0321 04:53:30.550413 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:30 crc kubenswrapper[4580]: E0321 04:53:30.651122 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:30 crc kubenswrapper[4580]: E0321 04:53:30.752072 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:30 crc kubenswrapper[4580]: E0321 04:53:30.852527 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:30 crc kubenswrapper[4580]: E0321 04:53:30.953376 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:31 crc kubenswrapper[4580]: E0321 04:53:31.053877 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:31 crc kubenswrapper[4580]: E0321 04:53:31.154349 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:31 crc kubenswrapper[4580]: E0321 04:53:31.255430 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:31 crc kubenswrapper[4580]: E0321 04:53:31.356319 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:31 crc 
kubenswrapper[4580]: E0321 04:53:31.457469 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:31 crc kubenswrapper[4580]: E0321 04:53:31.557808 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:31 crc kubenswrapper[4580]: E0321 04:53:31.658680 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:31 crc kubenswrapper[4580]: E0321 04:53:31.759436 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:31 crc kubenswrapper[4580]: E0321 04:53:31.860134 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:31 crc kubenswrapper[4580]: E0321 04:53:31.960233 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:32 crc kubenswrapper[4580]: E0321 04:53:32.061002 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:32 crc kubenswrapper[4580]: E0321 04:53:32.162191 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:32 crc kubenswrapper[4580]: E0321 04:53:32.262981 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:32 crc kubenswrapper[4580]: E0321 04:53:32.363971 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:32 crc kubenswrapper[4580]: E0321 04:53:32.464849 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:32 crc kubenswrapper[4580]: E0321 04:53:32.565324 4580 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 21 04:53:32 crc kubenswrapper[4580]: E0321 04:53:32.665972 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:32 crc kubenswrapper[4580]: E0321 04:53:32.766722 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:32 crc kubenswrapper[4580]: E0321 04:53:32.867471 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:32 crc kubenswrapper[4580]: E0321 04:53:32.968389 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:33 crc kubenswrapper[4580]: E0321 04:53:33.069068 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:33 crc kubenswrapper[4580]: E0321 04:53:33.169323 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:33 crc kubenswrapper[4580]: E0321 04:53:33.269984 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:33 crc kubenswrapper[4580]: E0321 04:53:33.370663 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:33 crc kubenswrapper[4580]: E0321 04:53:33.471334 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:33 crc kubenswrapper[4580]: I0321 04:53:33.569479 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:53:33 crc kubenswrapper[4580]: I0321 04:53:33.569699 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:53:33 crc kubenswrapper[4580]: I0321 04:53:33.571009 4580 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:33 crc kubenswrapper[4580]: I0321 04:53:33.571057 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:33 crc kubenswrapper[4580]: I0321 04:53:33.571067 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:33 crc kubenswrapper[4580]: E0321 04:53:33.571438 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:33 crc kubenswrapper[4580]: I0321 04:53:33.571811 4580 scope.go:117] "RemoveContainer" containerID="593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba" Mar 21 04:53:33 crc kubenswrapper[4580]: E0321 04:53:33.572015 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:53:33 crc kubenswrapper[4580]: E0321 04:53:33.672101 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:33 crc kubenswrapper[4580]: E0321 04:53:33.772641 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:33 crc kubenswrapper[4580]: E0321 04:53:33.873205 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:33 crc kubenswrapper[4580]: E0321 04:53:33.973452 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 
04:53:34.073960 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.174346 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.247949 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.254483 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.254544 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.254565 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.254812 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.254823 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:34Z","lastTransitionTime":"2026-03-21T04:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.265014 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.270017 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.270043 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.270052 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.270068 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.270080 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:34Z","lastTransitionTime":"2026-03-21T04:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.292797 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.298597 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.298626 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.298638 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.298659 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.298672 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:34Z","lastTransitionTime":"2026-03-21T04:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.312553 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.332171 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.332222 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.332240 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.332265 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.332288 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:34Z","lastTransitionTime":"2026-03-21T04:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.348650 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.348848 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.348877 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.449718 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.550851 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.651895 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.752752 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.810219 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.814678 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:53:34 crc kubenswrapper[4580]: 
I0321 04:53:34.817509 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.817572 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.817590 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:34 crc kubenswrapper[4580]: I0321 04:53:34.818953 4580 scope.go:117] "RemoveContainer" containerID="593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba" Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.819265 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.852995 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:34 crc kubenswrapper[4580]: E0321 04:53:34.953397 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:35 crc kubenswrapper[4580]: E0321 04:53:35.053735 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:35 crc kubenswrapper[4580]: E0321 04:53:35.154504 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:35 crc kubenswrapper[4580]: E0321 04:53:35.255629 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:35 crc 
kubenswrapper[4580]: E0321 04:53:35.355895 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:35 crc kubenswrapper[4580]: E0321 04:53:35.456393 4580 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:53:35 crc kubenswrapper[4580]: E0321 04:53:35.559955 4580 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 21 04:53:35 crc kubenswrapper[4580]: E0321 04:53:35.682182 4580 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:53:35 crc kubenswrapper[4580]: E0321 04:53:35.696571 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:53:38 crc kubenswrapper[4580]: I0321 04:53:38.909760 4580 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 21 04:53:40 crc kubenswrapper[4580]: E0321 04:53:40.697847 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:53:44 crc kubenswrapper[4580]: E0321 04:53:44.509217 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.514913 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.514972 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.514981 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.514996 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.515006 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:44Z","lastTransitionTime":"2026-03-21T04:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:44 crc kubenswrapper[4580]: E0321 04:53:44.526636 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.531148 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.531194 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.531207 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.531226 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.531239 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:44Z","lastTransitionTime":"2026-03-21T04:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:44 crc kubenswrapper[4580]: E0321 04:53:44.542465 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.547361 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.547416 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.547426 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.547441 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.547451 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:44Z","lastTransitionTime":"2026-03-21T04:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:44 crc kubenswrapper[4580]: E0321 04:53:44.557921 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.561718 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.562467 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.562511 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.562538 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.562552 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:44Z","lastTransitionTime":"2026-03-21T04:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:44 crc kubenswrapper[4580]: E0321 04:53:44.572630 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:44 crc kubenswrapper[4580]: E0321 04:53:44.572796 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.617038 4580 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.618239 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.618277 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.618286 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:44 crc kubenswrapper[4580]: I0321 04:53:44.855179 4580 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.591956 4580 apiserver.go:52] "Watching apiserver" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.598968 4580 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.599605 4580 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/network-metrics-daemon-fpb6h","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz","openshift-image-registry/node-ca-qhc9t","openshift-multus/multus-additional-cni-plugins-gk68q","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-2pzl9","openshift-dns/node-resolver-j7s9f","openshift-machine-config-operator/machine-config-daemon-7w8lj","openshift-multus/multus-z5bcs","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/iptables-alerter-4ln5h"] Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.600131 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.600205 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.600203 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.600268 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.600292 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.600345 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.600332 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.600353 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.600703 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.600929 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.600962 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.601341 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-j7s9f" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.601411 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.601465 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.601502 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.601562 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gk68q" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.601924 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.601993 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qhc9t" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.605244 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.606750 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.606969 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.607226 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.607387 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.607514 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.607696 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.608640 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.608644 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.608865 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 
04:53:45.608769 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.608969 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.609010 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.609118 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.609172 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.609337 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.609464 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.609574 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.609667 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.612306 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.612735 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" 
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.613557 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.614131 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.614613 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.614852 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.615469 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.615706 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.615760 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.615892 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.616077 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.616105 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.616485 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 21 
04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.616926 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.617336 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.617477 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.617596 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.619497 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.635185 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.642653 4580 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.647611 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.649977 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650013 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650035 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650058 4580 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650080 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650099 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650118 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650138 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650156 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650179 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650196 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650216 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650233 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650252 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650270 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650288 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650323 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650346 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650363 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650384 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650401 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650417 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650434 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650452 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650469 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650491 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650513 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650531 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650548 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650570 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650588 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650611 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650636 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650662 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650692 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650720 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650741 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" 
(UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650759 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650793 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650817 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650840 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650866 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 21 04:53:45 crc 
kubenswrapper[4580]: I0321 04:53:45.650886 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650910 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650929 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650949 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650970 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.650990 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651012 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651034 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651053 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651070 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651087 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 
04:53:45.651105 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651123 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651139 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651154 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651173 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651189 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") 
pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651210 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651228 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651244 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651260 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651276 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651291 4580 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651307 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651324 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651343 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651362 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651378 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651395 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651417 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651435 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651452 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651468 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651486 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651503 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651519 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651535 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651557 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651576 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" 
(UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651596 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651615 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651634 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651652 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651671 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651695 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651718 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651741 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651766 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651812 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651840 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 21 
04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651866 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651890 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651884 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651912 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651939 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651969 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.651998 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652024 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652068 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652094 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652128 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652149 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652168 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652190 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:53:45 crc 
kubenswrapper[4580]: I0321 04:53:45.652211 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652237 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652261 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652285 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652306 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652329 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652350 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652373 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652397 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652425 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652451 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 
04:53:45.652480 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652505 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652529 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652552 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652573 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652601 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652627 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652653 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652674 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652692 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652709 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652727 4580 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652743 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652762 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652800 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652819 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652837 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652853 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652872 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652892 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652915 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652959 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652996 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653021 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653058 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653079 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653098 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653116 4580 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653134 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653154 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653172 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653191 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653208 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653226 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653243 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653306 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653327 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653398 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653419 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653438 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653457 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653476 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653497 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653516 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 04:53:45 crc kubenswrapper[4580]: 
I0321 04:53:45.653534 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653556 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653575 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653595 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653614 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653632 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653767 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653851 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653916 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653939 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653957 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653981 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654001 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654022 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654046 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654074 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654105 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654133 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654164 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654189 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654210 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654266 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:53:45 crc 
kubenswrapper[4580]: I0321 04:53:45.654287 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654307 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654331 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654351 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654374 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654394 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654413 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654434 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654454 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654473 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654492 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654513 4580 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654532 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654623 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654647 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654670 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654694 4580 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654720 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-var-lib-cni-bin\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654743 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-env-overrides\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654771 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654811 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shb8k\" (UniqueName: \"kubernetes.io/projected/2b33648e-09ea-47e5-a32d-8bc5f0209e92-kube-api-access-shb8k\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654832 
4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654853 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-run-netns\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654871 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-multus-conf-dir\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654893 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-node-log\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654912 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-run-k8s-cni-cncf-io\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654932 4580 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7s5q\" (UniqueName: \"kubernetes.io/projected/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-kube-api-access-c7s5q\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654951 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a9668dcb-27e6-469d-aa01-da4dc9cf6664-rootfs\") pod \"machine-config-daemon-7w8lj\" (UID: \"a9668dcb-27e6-469d-aa01-da4dc9cf6664\") " pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654968 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-cni-netd\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654988 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655010 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655133 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-system-cni-dir\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655152 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-multus-cni-dir\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655178 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655197 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovnkube-config\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655230 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ab49cffe-6918-451d-bbf0-8933c7395982-hosts-file\") pod \"node-resolver-j7s9f\" (UID: \"ab49cffe-6918-451d-bbf0-8933c7395982\") " 
pod="openshift-dns/node-resolver-j7s9f" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655260 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655288 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-var-lib-kubelet\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655310 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/39312d7d-2530-4274-a347-e32998996270-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mh4jz\" (UID: \"39312d7d-2530-4274-a347-e32998996270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655330 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655351 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-var-lib-cni-multus\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655369 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-hostroot\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655391 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/39312d7d-2530-4274-a347-e32998996270-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mh4jz\" (UID: \"39312d7d-2530-4274-a347-e32998996270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655412 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmqbk\" (UniqueName: \"kubernetes.io/projected/39312d7d-2530-4274-a347-e32998996270-kube-api-access-cmqbk\") pod \"ovnkube-control-plane-749d76644c-mh4jz\" (UID: \"39312d7d-2530-4274-a347-e32998996270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655426 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-multus-daemon-config\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655448 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655466 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655484 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9668dcb-27e6-469d-aa01-da4dc9cf6664-proxy-tls\") pod \"machine-config-daemon-7w8lj\" (UID: \"a9668dcb-27e6-469d-aa01-da4dc9cf6664\") " pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655502 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-systemd\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655519 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-log-socket\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655534 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-cnibin\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655554 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-cni-binary-copy\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655572 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-run-multus-certs\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655589 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-run-netns\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655604 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-etc-openvswitch\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655621 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-openvswitch\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655638 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b0217ec0-8db1-4e76-bda0-e6299469b59c-system-cni-dir\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655658 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/39312d7d-2530-4274-a347-e32998996270-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mh4jz\" (UID: \"39312d7d-2530-4274-a347-e32998996270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655675 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-kubelet\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655692 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-slash\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655710 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovn-node-metrics-cert\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655728 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b0217ec0-8db1-4e76-bda0-e6299469b59c-os-release\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655743 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-os-release\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655763 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sjrq\" (UniqueName: \"kubernetes.io/projected/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-kube-api-access-6sjrq\") pod \"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655797 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovnkube-script-lib\") pod \"ovnkube-node-2pzl9\" 
(UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655815 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b0217ec0-8db1-4e76-bda0-e6299469b59c-cnibin\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655831 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b0217ec0-8db1-4e76-bda0-e6299469b59c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655859 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655878 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-multus-socket-dir-parent\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655898 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/1c44c4ee-33a3-482b-b409-8ad89483790d-host\") pod \"node-ca-qhc9t\" (UID: \"1c44c4ee-33a3-482b-b409-8ad89483790d\") " pod="openshift-image-registry/node-ca-qhc9t" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655917 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a9668dcb-27e6-469d-aa01-da4dc9cf6664-mcd-auth-proxy-config\") pod \"machine-config-daemon-7w8lj\" (UID: \"a9668dcb-27e6-469d-aa01-da4dc9cf6664\") " pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655936 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2k2d\" (UniqueName: \"kubernetes.io/projected/ab49cffe-6918-451d-bbf0-8933c7395982-kube-api-access-j2k2d\") pod \"node-resolver-j7s9f\" (UID: \"ab49cffe-6918-451d-bbf0-8933c7395982\") " pod="openshift-dns/node-resolver-j7s9f" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655954 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b0217ec0-8db1-4e76-bda0-e6299469b59c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655976 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc78n\" (UniqueName: \"kubernetes.io/projected/b0217ec0-8db1-4e76-bda0-e6299469b59c-kube-api-access-pc78n\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655993 4580 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c44c4ee-33a3-482b-b409-8ad89483790d-serviceca\") pod \"node-ca-qhc9t\" (UID: \"1c44c4ee-33a3-482b-b409-8ad89483790d\") " pod="openshift-image-registry/node-ca-qhc9t" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656011 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-var-lib-openvswitch\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656026 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-ovn\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656043 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-cni-bin\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656061 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xphxp\" (UniqueName: \"kubernetes.io/projected/1c44c4ee-33a3-482b-b409-8ad89483790d-kube-api-access-xphxp\") pod \"node-ca-qhc9t\" (UID: \"1c44c4ee-33a3-482b-b409-8ad89483790d\") " pod="openshift-image-registry/node-ca-qhc9t" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656079 4580 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-etc-kubernetes\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656101 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78cs2\" (UniqueName: \"kubernetes.io/projected/a9668dcb-27e6-469d-aa01-da4dc9cf6664-kube-api-access-78cs2\") pod \"machine-config-daemon-7w8lj\" (UID: \"a9668dcb-27e6-469d-aa01-da4dc9cf6664\") " pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656117 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs\") pod \"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656134 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-systemd-units\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656151 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-run-ovn-kubernetes\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 
04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656169 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b0217ec0-8db1-4e76-bda0-e6299469b59c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656191 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656266 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.663462 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.678459 4580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.681691 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.684275 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652164 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652361 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.684462 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652597 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652852 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.684631 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.652959 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653062 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653312 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653559 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.653767 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654258 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654266 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654487 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654714 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.654992 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655089 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655211 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655378 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655423 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.655920 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656029 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.656441 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.659194 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.659293 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.659461 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.659492 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.659524 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.659807 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.659829 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.659837 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.660151 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.684949 4580 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.660487 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.660811 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.660903 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.661203 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.661353 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.661411 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.661553 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.661915 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.661960 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.662137 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.662315 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.662606 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.663313 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.663592 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.663876 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.663958 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.665078 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.665326 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.665514 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.665701 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.666587 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.666747 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.667019 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.667275 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.667380 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.667493 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.667625 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.667692 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.667713 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.667823 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.667869 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.668031 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.668292 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.668170 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.668852 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.669302 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.669506 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.669880 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.670145 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.670227 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.670505 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.670538 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.670845 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.670883 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.671149 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.670210 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.671263 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.671304 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.671511 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.671729 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.672427 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.672467 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.672512 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.672578 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.672677 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.672768 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.673734 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.673915 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.674154 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.674290 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.674504 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.674760 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.674847 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.674872 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.674939 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.675134 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.675417 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.675943 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.676329 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.676308 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.676372 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.685368 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.676421 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.676591 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.676687 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.676748 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.676880 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.677001 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.677087 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.677090 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.677124 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.677384 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.678042 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.678151 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.678514 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.678713 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.678733 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.678846 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.679054 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.679669 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.679819 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.679910 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.680322 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.680704 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.680892 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.680954 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.681151 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.681171 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.681321 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.681450 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.681448 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.682104 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.682306 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:53:46.182277657 +0000 UTC m=+131.264861295 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.682354 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.685876 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.682395 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.682936 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.683545 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.683669 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.683865 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.684324 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.684441 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.684852 4580 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.686387 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.660370 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.686412 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.686758 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.686977 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.687417 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.687644 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.687722 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.687912 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:46.187889108 +0000 UTC m=+131.270472806 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.688365 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:46.188352128 +0000 UTC m=+131.270935846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.688473 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.688884 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.689497 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.689557 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.693033 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.693411 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.694020 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.699414 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.702466 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.706417 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.706435 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.706616 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.707043 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.707062 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.712948 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.713200 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.713401 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.713463 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.713473 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.714022 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.714232 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.714256 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.714270 4580 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.714334 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:46.214313911 +0000 UTC m=+131.296897539 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.714414 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.714457 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.714476 4580 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.714532 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:46.214513885 +0000 UTC m=+131.297097703 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.716980 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.717086 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.718013 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.718112 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.718823 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.719045 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.721082 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.721139 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.721472 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.721682 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.721920 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.723250 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.723589 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.723504 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.723716 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.723732 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.723717 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.724074 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.724104 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.724173 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.724618 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.724812 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.724910 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.725079 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.725666 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.725739 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.725949 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.730139 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.731512 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.748030 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.750280 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.752220 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.756465 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.756983 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-var-lib-cni-bin\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757024 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-env-overrides\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757069 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-run-netns\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757088 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-multus-conf-dir\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 
04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757107 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shb8k\" (UniqueName: \"kubernetes.io/projected/2b33648e-09ea-47e5-a32d-8bc5f0209e92-kube-api-access-shb8k\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757134 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-node-log\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757152 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-run-k8s-cni-cncf-io\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757195 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-run-k8s-cni-cncf-io\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757206 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7s5q\" (UniqueName: \"kubernetes.io/projected/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-kube-api-access-c7s5q\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757228 4580 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-var-lib-cni-bin\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757243 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-cni-netd\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757296 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757322 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-system-cni-dir\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757344 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-multus-cni-dir\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757394 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/a9668dcb-27e6-469d-aa01-da4dc9cf6664-rootfs\") pod \"machine-config-daemon-7w8lj\" (UID: \"a9668dcb-27e6-469d-aa01-da4dc9cf6664\") " pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757427 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovnkube-config\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757451 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ab49cffe-6918-451d-bbf0-8933c7395982-hosts-file\") pod \"node-resolver-j7s9f\" (UID: \"ab49cffe-6918-451d-bbf0-8933c7395982\") " pod="openshift-dns/node-resolver-j7s9f" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757483 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-var-lib-kubelet\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757501 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/39312d7d-2530-4274-a347-e32998996270-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mh4jz\" (UID: \"39312d7d-2530-4274-a347-e32998996270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757511 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-run-netns\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757555 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757519 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757595 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-node-log\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757618 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-var-lib-cni-multus\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757625 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-var-lib-kubelet\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757597 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757643 4580 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-hostroot\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757652 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a9668dcb-27e6-469d-aa01-da4dc9cf6664-rootfs\") pod \"machine-config-daemon-7w8lj\" (UID: \"a9668dcb-27e6-469d-aa01-da4dc9cf6664\") " pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757667 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/39312d7d-2530-4274-a347-e32998996270-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mh4jz\" (UID: \"39312d7d-2530-4274-a347-e32998996270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757690 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmqbk\" (UniqueName: \"kubernetes.io/projected/39312d7d-2530-4274-a347-e32998996270-kube-api-access-cmqbk\") pod \"ovnkube-control-plane-749d76644c-mh4jz\" (UID: \"39312d7d-2530-4274-a347-e32998996270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757611 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-multus-conf-dir\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757721 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-multus-cni-dir\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757714 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757819 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757844 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-hostroot\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757873 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9668dcb-27e6-469d-aa01-da4dc9cf6664-proxy-tls\") pod \"machine-config-daemon-7w8lj\" (UID: \"a9668dcb-27e6-469d-aa01-da4dc9cf6664\") " pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757900 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-systemd\") pod \"ovnkube-node-2pzl9\" (UID: 
\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757923 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-log-socket\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757943 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-cnibin\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757963 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-cni-binary-copy\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.757986 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-multus-daemon-config\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758006 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ab49cffe-6918-451d-bbf0-8933c7395982-hosts-file\") pod \"node-resolver-j7s9f\" (UID: \"ab49cffe-6918-451d-bbf0-8933c7395982\") " pod="openshift-dns/node-resolver-j7s9f" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758074 4580 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-log-socket\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758076 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-run-multus-certs\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758098 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-run-multus-certs\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758105 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-cni-netd\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758119 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-etc-openvswitch\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758146 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-openvswitch\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758172 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b0217ec0-8db1-4e76-bda0-e6299469b59c-system-cni-dir\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758197 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/39312d7d-2530-4274-a347-e32998996270-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mh4jz\" (UID: \"39312d7d-2530-4274-a347-e32998996270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758225 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-run-netns\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758253 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-kubelet\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758075 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758146 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-system-cni-dir\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758368 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-slash\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758399 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovn-node-metrics-cert\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758425 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b0217ec0-8db1-4e76-bda0-e6299469b59c-os-release\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758450 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sjrq\" (UniqueName: 
\"kubernetes.io/projected/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-kube-api-access-6sjrq\") pod \"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758470 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovnkube-script-lib\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758495 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b0217ec0-8db1-4e76-bda0-e6299469b59c-cnibin\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758520 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b0217ec0-8db1-4e76-bda0-e6299469b59c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758549 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-os-release\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758572 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-multus-socket-dir-parent\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758684 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c44c4ee-33a3-482b-b409-8ad89483790d-host\") pod \"node-ca-qhc9t\" (UID: \"1c44c4ee-33a3-482b-b409-8ad89483790d\") " pod="openshift-image-registry/node-ca-qhc9t"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758730 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a9668dcb-27e6-469d-aa01-da4dc9cf6664-mcd-auth-proxy-config\") pod \"machine-config-daemon-7w8lj\" (UID: \"a9668dcb-27e6-469d-aa01-da4dc9cf6664\") " pod="openshift-machine-config-operator/machine-config-daemon-7w8lj"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758753 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2k2d\" (UniqueName: \"kubernetes.io/projected/ab49cffe-6918-451d-bbf0-8933c7395982-kube-api-access-j2k2d\") pod \"node-resolver-j7s9f\" (UID: \"ab49cffe-6918-451d-bbf0-8933c7395982\") " pod="openshift-dns/node-resolver-j7s9f"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758795 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b0217ec0-8db1-4e76-bda0-e6299469b59c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758823 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc78n\" (UniqueName: \"kubernetes.io/projected/b0217ec0-8db1-4e76-bda0-e6299469b59c-kube-api-access-pc78n\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758845 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c44c4ee-33a3-482b-b409-8ad89483790d-serviceca\") pod \"node-ca-qhc9t\" (UID: \"1c44c4ee-33a3-482b-b409-8ad89483790d\") " pod="openshift-image-registry/node-ca-qhc9t"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758869 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-var-lib-openvswitch\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758898 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-ovn\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758918 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-cni-bin\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758939 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xphxp\" (UniqueName: \"kubernetes.io/projected/1c44c4ee-33a3-482b-b409-8ad89483790d-kube-api-access-xphxp\") pod \"node-ca-qhc9t\" (UID: \"1c44c4ee-33a3-482b-b409-8ad89483790d\") " pod="openshift-image-registry/node-ca-qhc9t"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758940 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovnkube-config\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.758964 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78cs2\" (UniqueName: \"kubernetes.io/projected/a9668dcb-27e6-469d-aa01-da4dc9cf6664-kube-api-access-78cs2\") pod \"machine-config-daemon-7w8lj\" (UID: \"a9668dcb-27e6-469d-aa01-da4dc9cf6664\") " pod="openshift-machine-config-operator/machine-config-daemon-7w8lj"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.759033 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs\") pod \"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.759142 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-systemd-units\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.759153 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-multus-socket-dir-parent\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.759170 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-run-ovn-kubernetes\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.759177 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-kubelet\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.759204 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-etc-openvswitch\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.759229 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-openvswitch\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.759232 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-run-ovn-kubernetes\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.759259 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b0217ec0-8db1-4e76-bda0-e6299469b59c-system-cni-dir\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q"
Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.759325 4580 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 21 04:53:45 crc kubenswrapper[4580]: E0321 04:53:45.759411 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs podName:ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:46.259390668 +0000 UTC m=+131.341974296 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs") pod "network-metrics-daemon-fpb6h" (UID: "ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.759700 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-systemd-units\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.760716 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-host-var-lib-cni-multus\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.760759 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-run-netns\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.760811 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-slash\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.761709 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/39312d7d-2530-4274-a347-e32998996270-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mh4jz\" (UID: \"39312d7d-2530-4274-a347-e32998996270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.762029 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-systemd\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.762404 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-cni-binary-copy\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.762501 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-var-lib-openvswitch\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.763377 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b0217ec0-8db1-4e76-bda0-e6299469b59c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.765187 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c44c4ee-33a3-482b-b409-8ad89483790d-serviceca\") pod \"node-ca-qhc9t\" (UID: \"1c44c4ee-33a3-482b-b409-8ad89483790d\") " pod="openshift-image-registry/node-ca-qhc9t"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.765241 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b0217ec0-8db1-4e76-bda0-e6299469b59c-os-release\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.765262 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovn-node-metrics-cert\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.765349 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-ovn\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.765426 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-cni-bin\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.765731 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b0217ec0-8db1-4e76-bda0-e6299469b59c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.765770 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-etc-kubernetes\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.766453 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b0217ec0-8db1-4e76-bda0-e6299469b59c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.766484 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-etc-kubernetes\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.767252 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-cnibin\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.767285 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b0217ec0-8db1-4e76-bda0-e6299469b59c-cnibin\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.767946 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovnkube-script-lib\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.767997 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-os-release\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768391 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b0217ec0-8db1-4e76-bda0-e6299469b59c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768433 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c44c4ee-33a3-482b-b409-8ad89483790d-host\") pod \"node-ca-qhc9t\" (UID: \"1c44c4ee-33a3-482b-b409-8ad89483790d\") " pod="openshift-image-registry/node-ca-qhc9t"
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768687 4580 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768704 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768717 4580 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768731 4580 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768741 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768750 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768759 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768768 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768777 4580 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768809 4580 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768818 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768831 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768840 4580 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768848 4580 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768859 4580 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768872 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768880 4580 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768889 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768898 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768905 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768914 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768926 4580 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768940 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768955 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768963 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768971 4580 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768980 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768988 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.768996 4580 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769008 4580 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769017 4580 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769027 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769036 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769044 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769052 4580 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769060 4580 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769068 4580 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769077 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769087 4580 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769096 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769104 4580 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769112 4580 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769129 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769137 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769146 4580 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769159 4580 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769167 4580 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769175 4580 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769189 4580 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769197 4580 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769205 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769214 4580 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769223 4580 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769231 4580 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769240 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769253 4580 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769263 4580 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769272 4580 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769281 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769294 4580 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769304 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769312 4580 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769324 4580 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769332 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769341 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769350 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769362 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769372 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769380 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769419 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769430 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769440 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769449 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769462 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath
\"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769475 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769485 4580 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769496 4580 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769506 4580 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769519 4580 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769531 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769540 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769551 4580 
reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769562 4580 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769572 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769581 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769591 4580 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769601 4580 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769610 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769619 4580 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769629 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769637 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769646 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769655 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769666 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769675 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769695 4580 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc 
kubenswrapper[4580]: I0321 04:53:45.769705 4580 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769712 4580 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769721 4580 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769729 4580 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769738 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769745 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769753 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769764 4580 reconciler_common.go:293] 
"Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769771 4580 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769820 4580 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769834 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769842 4580 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769850 4580 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769858 4580 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769868 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769877 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769885 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769895 4580 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769905 4580 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769914 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769922 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769930 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath 
\"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769937 4580 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769946 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769953 4580 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769964 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769974 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769982 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769990 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.769998 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" 
(UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770006 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770016 4580 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770024 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770032 4580 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770039 4580 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770052 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770063 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on 
node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770071 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770079 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770087 4580 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770095 4580 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770103 4580 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770111 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770119 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770127 
4580 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770136 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770148 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770155 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770163 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770171 4580 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770178 4580 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770189 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770197 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770204 4580 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770214 4580 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770221 4580 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770229 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770238 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770250 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770257 4580 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770265 4580 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770273 4580 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770280 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770290 4580 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770303 4580 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770313 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node 
\"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770321 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770329 4580 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770337 4580 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770345 4580 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770352 4580 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770366 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770374 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770385 4580 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770392 4580 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770400 4580 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770409 4580 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770417 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770427 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770434 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770442 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770451 4580 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770460 4580 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770484 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770492 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770501 4580 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770510 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770519 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" 
DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770527 4580 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770539 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770548 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770556 4580 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770564 4580 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770572 4580 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770580 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770592 4580 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.770600 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.771596 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9668dcb-27e6-469d-aa01-da4dc9cf6664-proxy-tls\") pod \"machine-config-daemon-7w8lj\" (UID: \"a9668dcb-27e6-469d-aa01-da4dc9cf6664\") " pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.774175 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/39312d7d-2530-4274-a347-e32998996270-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mh4jz\" (UID: \"39312d7d-2530-4274-a347-e32998996270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.776199 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/39312d7d-2530-4274-a347-e32998996270-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mh4jz\" (UID: \"39312d7d-2530-4274-a347-e32998996270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.778400 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shb8k\" (UniqueName: \"kubernetes.io/projected/2b33648e-09ea-47e5-a32d-8bc5f0209e92-kube-api-access-shb8k\") pod 
\"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.779575 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a9668dcb-27e6-469d-aa01-da4dc9cf6664-mcd-auth-proxy-config\") pod \"machine-config-daemon-7w8lj\" (UID: \"a9668dcb-27e6-469d-aa01-da4dc9cf6664\") " pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.780042 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-multus-daemon-config\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.780418 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-env-overrides\") pod \"ovnkube-node-2pzl9\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.783693 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc78n\" (UniqueName: \"kubernetes.io/projected/b0217ec0-8db1-4e76-bda0-e6299469b59c-kube-api-access-pc78n\") pod \"multus-additional-cni-plugins-gk68q\" (UID: \"b0217ec0-8db1-4e76-bda0-e6299469b59c\") " pod="openshift-multus/multus-additional-cni-plugins-gk68q" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.785090 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78cs2\" (UniqueName: \"kubernetes.io/projected/a9668dcb-27e6-469d-aa01-da4dc9cf6664-kube-api-access-78cs2\") pod \"machine-config-daemon-7w8lj\" 
(UID: \"a9668dcb-27e6-469d-aa01-da4dc9cf6664\") " pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.785548 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.786742 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xphxp\" (UniqueName: \"kubernetes.io/projected/1c44c4ee-33a3-482b-b409-8ad89483790d-kube-api-access-xphxp\") pod \"node-ca-qhc9t\" (UID: \"1c44c4ee-33a3-482b-b409-8ad89483790d\") " pod="openshift-image-registry/node-ca-qhc9t" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.787491 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmqbk\" (UniqueName: \"kubernetes.io/projected/39312d7d-2530-4274-a347-e32998996270-kube-api-access-cmqbk\") pod \"ovnkube-control-plane-749d76644c-mh4jz\" (UID: \"39312d7d-2530-4274-a347-e32998996270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.788188 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2k2d\" (UniqueName: \"kubernetes.io/projected/ab49cffe-6918-451d-bbf0-8933c7395982-kube-api-access-j2k2d\") pod \"node-resolver-j7s9f\" (UID: \"ab49cffe-6918-451d-bbf0-8933c7395982\") " pod="openshift-dns/node-resolver-j7s9f" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.788276 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sjrq\" (UniqueName: \"kubernetes.io/projected/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-kube-api-access-6sjrq\") pod \"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.788735 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7s5q\" (UniqueName: 
\"kubernetes.io/projected/f6761e28-8a0c-4ea2-b248-2bd60e3862e6-kube-api-access-c7s5q\") pod \"multus-z5bcs\" (UID: \"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\") " pod="openshift-multus/multus-z5bcs" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.795108 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.806891 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.820025 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.835916 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.847133 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.861209 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.874539 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.887757 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.901267 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.914341 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.920296 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.924846 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.935990 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.936950 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.946631 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.947302 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.957647 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.961245 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: W0321 04:53:45.969301 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-cd7a475e6a045669c1aa9676d2f15fd8043141e8adeb42c9413fe2bb4cc745a3 WatchSource:0}: Error finding container cd7a475e6a045669c1aa9676d2f15fd8043141e8adeb42c9413fe2bb4cc745a3: Status 404 returned error can't find the container with id cd7a475e6a045669c1aa9676d2f15fd8043141e8adeb42c9413fe2bb4cc745a3 Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.971194 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.972937 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: W0321 04:53:45.974245 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9668dcb_27e6_469d_aa01_da4dc9cf6664.slice/crio-760fe6f79c9cb5e6fb1d0b584d27df9092f031a75621fd3d86b7624c1cbb7939 WatchSource:0}: Error finding container 760fe6f79c9cb5e6fb1d0b584d27df9092f031a75621fd3d86b7624c1cbb7939: Status 404 returned error can't find the container with id 760fe6f79c9cb5e6fb1d0b584d27df9092f031a75621fd3d86b7624c1cbb7939 Mar 21 04:53:45 crc kubenswrapper[4580]: I0321 04:53:45.986851 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:45 crc kubenswrapper[4580]: W0321 04:53:45.990870 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-90caf3080caee3ef82fbdc2c0d1a153184667ef5d9c52a8b9eb314a935eb1831 WatchSource:0}: Error finding container 90caf3080caee3ef82fbdc2c0d1a153184667ef5d9c52a8b9eb314a935eb1831: Status 404 returned error can't find the container with id 90caf3080caee3ef82fbdc2c0d1a153184667ef5d9c52a8b9eb314a935eb1831 Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.004035 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.010749 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.018350 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-j7s9f" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.028886 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.031004 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-z5bcs" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.040241 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.042595 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gk68q" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.049651 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.051514 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" Mar 21 04:53:46 crc kubenswrapper[4580]: W0321 04:53:46.067502 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6761e28_8a0c_4ea2_b248_2bd60e3862e6.slice/crio-242646bb123af48ff63a64db961fa29089d9739f532fc62e5e56d4c0de8eaf35 WatchSource:0}: Error finding container 242646bb123af48ff63a64db961fa29089d9739f532fc62e5e56d4c0de8eaf35: Status 404 returned error can't find the container with id 242646bb123af48ff63a64db961fa29089d9739f532fc62e5e56d4c0de8eaf35 Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.074733 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qhc9t" Mar 21 04:53:46 crc kubenswrapper[4580]: W0321 04:53:46.110239 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39312d7d_2530_4274_a347_e32998996270.slice/crio-9ec388955e19c744f833fb6f35608e28784a5d45d3dbf2b49d05a0ec800eb447 WatchSource:0}: Error finding container 9ec388955e19c744f833fb6f35608e28784a5d45d3dbf2b49d05a0ec800eb447: Status 404 returned error can't find the container with id 9ec388955e19c744f833fb6f35608e28784a5d45d3dbf2b49d05a0ec800eb447 Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.194424 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed"} Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.194487 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"760fe6f79c9cb5e6fb1d0b584d27df9092f031a75621fd3d86b7624c1cbb7939"} Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.198606 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" event={"ID":"39312d7d-2530-4274-a347-e32998996270","Type":"ContainerStarted","Data":"9ec388955e19c744f833fb6f35608e28784a5d45d3dbf2b49d05a0ec800eb447"} Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.201447 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j7s9f" event={"ID":"ab49cffe-6918-451d-bbf0-8933c7395982","Type":"ContainerStarted","Data":"a52506ec4fd749ab70779592acc6af237cceb6d976b06c5c9e8e4ca9069f52e9"} Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.202742 
4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a22e5830a82c10b8d669a7bda4f060b729e6b384accaf65b7ac08f0574bb592f"} Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.205342 4580 generic.go:334] "Generic (PLEG): container finished" podID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerID="2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623" exitCode=0 Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.205412 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623"} Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.205440 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerStarted","Data":"82a82ffa831bf919d6f70d37fa1437ec01369365dc51cd7c6e8f31df8b28fcdf"} Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.208231 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429"} Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.208328 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"90caf3080caee3ef82fbdc2c0d1a153184667ef5d9c52a8b9eb314a935eb1831"} Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.215048 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"cd7a475e6a045669c1aa9676d2f15fd8043141e8adeb42c9413fe2bb4cc745a3"} Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.217415 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5bcs" event={"ID":"f6761e28-8a0c-4ea2-b248-2bd60e3862e6","Type":"ContainerStarted","Data":"242646bb123af48ff63a64db961fa29089d9739f532fc62e5e56d4c0de8eaf35"} Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.221736 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qhc9t" event={"ID":"1c44c4ee-33a3-482b-b409-8ad89483790d","Type":"ContainerStarted","Data":"fa50abd69010f09b4047489e25b3d607062b82be2a8ec6191f988aa6354c67c2"} Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.223458 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.228043 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" event={"ID":"b0217ec0-8db1-4e76-bda0-e6299469b59c","Type":"ContainerStarted","Data":"91af410d98ea734ef7953a43d18e28d3d35e7f87ea66d2be3c4e872897062450"} Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.247508 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.262711 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.276886 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.278243 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.278412 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" 
(UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.278481 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs\") pod \"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.278541 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.278594 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.278666 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.278762 4580 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.278873 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:47.278847265 +0000 UTC m=+132.361430893 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.279176 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.279202 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.279221 4580 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.279297 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:47.279271734 +0000 UTC m=+132.361855362 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.279406 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:53:47.279390746 +0000 UTC m=+132.361974374 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.279495 4580 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.279549 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:47.27954023 +0000 UTC m=+132.362123858 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.279613 4580 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.279667 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs podName:ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:47.279655392 +0000 UTC m=+132.362239020 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs") pod "network-metrics-daemon-fpb6h" (UID: "ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.280140 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.280171 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.280188 4580 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:46 crc kubenswrapper[4580]: E0321 04:53:46.280242 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:47.280228494 +0000 UTC m=+132.362812122 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.291859 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.306996 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.321190 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.335933 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.353401 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.376716 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.388013 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.401821 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.412489 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.424220 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:53:46 crc kubenswrapper[4580]: I0321 04:53:46.629566 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.236311 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.256662 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.257326 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerStarted","Data":"20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.257367 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" 
event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerStarted","Data":"a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.257386 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerStarted","Data":"b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.257396 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerStarted","Data":"9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.257405 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerStarted","Data":"52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.257413 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerStarted","Data":"add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.260297 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.263004 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.264627 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5bcs" event={"ID":"f6761e28-8a0c-4ea2-b248-2bd60e3862e6","Type":"ContainerStarted","Data":"2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.266520 4580 generic.go:334] "Generic (PLEG): container finished" podID="b0217ec0-8db1-4e76-bda0-e6299469b59c" containerID="b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775" exitCode=0 Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.266578 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" event={"ID":"b0217ec0-8db1-4e76-bda0-e6299469b59c","Type":"ContainerDied","Data":"b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.268151 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qhc9t" event={"ID":"1c44c4ee-33a3-482b-b409-8ad89483790d","Type":"ContainerStarted","Data":"e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.274354 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" event={"ID":"39312d7d-2530-4274-a347-e32998996270","Type":"ContainerStarted","Data":"44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.274552 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" 
event={"ID":"39312d7d-2530-4274-a347-e32998996270","Type":"ContainerStarted","Data":"921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.276760 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j7s9f" event={"ID":"ab49cffe-6918-451d-bbf0-8933c7395982","Type":"ContainerStarted","Data":"60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201"} Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.285430 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.290268 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.290511 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.290550 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:53:49.290505559 +0000 UTC m=+134.373089187 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.290723 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.290843 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs\") pod \"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.290918 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.290985 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.291166 4580 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.291278 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:49.291263316 +0000 UTC m=+134.373846944 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.290675 4580 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.291641 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:49.291631554 +0000 UTC m=+134.374215182 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.292258 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.292312 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.292271 4580 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.292415 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs podName:ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:49.29239193 +0000 UTC m=+134.374975558 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs") pod "network-metrics-daemon-fpb6h" (UID: "ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.292325 4580 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.292478 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:49.292470982 +0000 UTC m=+134.375054610 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.292564 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.292601 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.292614 4580 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.292647 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:49.292639286 +0000 UTC m=+134.375222904 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.307947 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.329133 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.348501 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.362112 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.378479 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.410167 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.430079 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.451863 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.469626 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.493477 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.517331 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.534870 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc 
kubenswrapper[4580]: I0321 04:53:47.550626 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.569224 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.594044 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.616675 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.617031 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.617154 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.617229 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.617263 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.617244 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.617363 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.617437 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:53:47 crc kubenswrapper[4580]: E0321 04:53:47.617504 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.622824 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.623996 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.625489 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.626358 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.627665 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.628366 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.629140 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.630416 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.631347 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.634502 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.635410 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.636381 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.637134 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.638866 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.639535 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.640263 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.640588 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.641582 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.642590 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.644071 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.644899 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.645566 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 
04:53:47.646833 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.647382 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.648763 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.649422 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.650843 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.651586 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.652274 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.653337 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 
04:53:47.653939 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.655006 4580 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.655143 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.656013 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb
676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\
"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.657231 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.658652 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.659286 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.661288 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.662201 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.663386 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" 
path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.664265 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.665718 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.666378 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.667633 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.669050 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.669980 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.670869 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.671407 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.672581 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.673491 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.673676 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0
f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.674152 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 21 04:53:47 crc 
kubenswrapper[4580]: I0321 04:53:47.675227 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.675983 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.677052 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.677614 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.678117 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.693330 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.707260 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc 
kubenswrapper[4580]: I0321 04:53:47.724905 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc 
kubenswrapper[4580]: I0321 04:53:47.744764 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.760641 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.776215 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21
T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.789981 4580 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.807095 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:47 crc kubenswrapper[4580]: I0321 04:53:47.819591 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.282455 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" event={"ID":"b0217ec0-8db1-4e76-bda0-e6299469b59c","Type":"ContainerStarted","Data":"98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe"} Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.318026 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.351559 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.386377 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.426627 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.449363 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.472340 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.490381 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc 
kubenswrapper[4580]: I0321 04:53:48.510730 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.532879 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.557808 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.590237 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21
T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.607931 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin
\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.628113 4580 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.631031 4580 scope.go:117] "RemoveContainer" containerID="593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.631191 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:53:48 crc kubenswrapper[4580]: E0321 04:53:48.631280 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.648309 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:48 crc kubenswrapper[4580]: I0321 04:53:48.665295 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.288935 4580 generic.go:334] "Generic (PLEG): container finished" podID="b0217ec0-8db1-4e76-bda0-e6299469b59c" containerID="98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe" exitCode=0 Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.289190 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" event={"ID":"b0217ec0-8db1-4e76-bda0-e6299469b59c","Type":"ContainerDied","Data":"98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe"} Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.298281 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" 
event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerStarted","Data":"df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd"} Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.307931 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do 
sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.312375 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf"} Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.313170 4580 scope.go:117] "RemoveContainer" containerID="593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba" Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.313426 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.314201 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314384 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:53:53.314360881 +0000 UTC m=+138.396944509 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.314460 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.314506 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs\") pod \"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.314545 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.314573 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.314607 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314657 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314686 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314701 4580 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314723 4580 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314749 4580 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314755 4580 
secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314795 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314830 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314845 4580 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314766 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:53.314745059 +0000 UTC m=+138.397328697 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314903 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:53.314889172 +0000 UTC m=+138.397473000 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314919 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:53.314912643 +0000 UTC m=+138.397496491 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314937 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs podName:ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:53.314927973 +0000 UTC m=+138.397511831 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs") pod "network-metrics-daemon-fpb6h" (UID: "ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.314949 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:53.314943503 +0000 UTC m=+138.397527361 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.324291 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kub
e\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.336601 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 
2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.352433 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.372649 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 
04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.391198 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.407923 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.426078 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.440474 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.456745 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.471131 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.488389 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.502973 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.520471 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.544750 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.558765 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.575954 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.591366 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.606723 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.617079 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.617104 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.617179 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.617251 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.617488 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.617699 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.617776 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:53:49 crc kubenswrapper[4580]: E0321 04:53:49.617862 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.622384 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.636335 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.659329 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.670124 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.688666 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.704256 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.720530 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc 
kubenswrapper[4580]: I0321 04:53:49.735604 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.751548 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 
04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.767935 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.784581 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.802720 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:49 crc kubenswrapper[4580]: I0321 04:53:49.816934 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.317745 4580 generic.go:334] "Generic (PLEG): container finished" podID="b0217ec0-8db1-4e76-bda0-e6299469b59c" containerID="e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d" exitCode=0 Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.317826 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" event={"ID":"b0217ec0-8db1-4e76-bda0-e6299469b59c","Type":"ContainerDied","Data":"e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d"} Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.339828 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.358699 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.377539 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 
04:53:50.393685 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.408757 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.423227 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.437935 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.453870 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.468068 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.493056 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.507015 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.522406 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.537035 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.548974 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc 
kubenswrapper[4580]: I0321 04:53:50.565226 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: I0321 04:53:50.581631 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:50Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:50 crc kubenswrapper[4580]: E0321 04:53:50.700369 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.324677 4580 generic.go:334] "Generic (PLEG): container finished" podID="b0217ec0-8db1-4e76-bda0-e6299469b59c" containerID="7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b" exitCode=0 Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.324747 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" event={"ID":"b0217ec0-8db1-4e76-bda0-e6299469b59c","Type":"ContainerDied","Data":"7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b"} Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.344743 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.361908 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.387325 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.409508 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.426479 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.441266 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.456694 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc 
kubenswrapper[4580]: I0321 04:53:51.474709 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.491893 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.503086 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.517020 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5
45c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.528853 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.541437 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.555682 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.570899 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.585733 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:53:51Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.617426 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.617506 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:53:51 crc kubenswrapper[4580]: E0321 04:53:51.617596 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:53:51 crc kubenswrapper[4580]: E0321 04:53:51.617687 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.617773 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:53:51 crc kubenswrapper[4580]: E0321 04:53:51.617845 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:53:51 crc kubenswrapper[4580]: I0321 04:53:51.617902 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:51 crc kubenswrapper[4580]: E0321 04:53:51.617955 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:53:52 crc kubenswrapper[4580]: I0321 04:53:52.334191 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerStarted","Data":"9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f"} Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.342634 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" event={"ID":"b0217ec0-8db1-4e76-bda0-e6299469b59c","Type":"ContainerStarted","Data":"88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109"} Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.343401 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.362033 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.370609 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.370857 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371000 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:54:01.370950604 +0000 UTC m=+146.453534282 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371050 4580 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.371071 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.371114 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs\") pod \"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371147 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:01.371118718 +0000 UTC m=+146.453702366 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371193 4580 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371298 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371321 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371340 4580 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.371187 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371439 4580 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371480 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371506 4580 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371720 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs podName:ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:01.37122275 +0000 UTC m=+146.453806378 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs") pod "network-metrics-daemon-fpb6h" (UID: "ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371770 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:01.371759562 +0000 UTC m=+146.454343190 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.371827 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371933 4580 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.371981 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:01.371952226 +0000 UTC m=+146.454536014 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.372023 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:01.372008607 +0000 UTC m=+146.454592435 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.389149 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.401805 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.416132 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.423773 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.429171 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.442798 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.464742 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.515760 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.538021 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.566004 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.579053 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.601558 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.613820 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.617059 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.617126 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.617168 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.617177 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.617201 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.617283 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.617329 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:53:53 crc kubenswrapper[4580]: E0321 04:53:53.617382 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.629524 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 
10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.644355 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.657770 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc 
kubenswrapper[4580]: I0321 04:53:53.671421 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.685383 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.699905 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.712355 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.723626 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.741513 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.756180 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.769557 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.786104 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.800800 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc 
kubenswrapper[4580]: I0321 04:53:53.819232 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.840216 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 
04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.863367 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.879751 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.901107 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc7
8n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:53 crc kubenswrapper[4580]: I0321 04:53:53.915010 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520e
d63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.350326 4580 generic.go:334] "Generic (PLEG): container finished" podID="b0217ec0-8db1-4e76-bda0-e6299469b59c" containerID="88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109" exitCode=0 Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.350445 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" event={"ID":"b0217ec0-8db1-4e76-bda0-e6299469b59c","Type":"ContainerDied","Data":"88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109"} Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.351204 4580 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.351262 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.369209 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.382654 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.393002 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash 
-exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.412047 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.428661 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc 
kubenswrapper[4580]: I0321 04:53:54.450454 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc 
kubenswrapper[4580]: I0321 04:53:54.469346 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.486013 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.501247 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.518234 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.532629 4580 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.547330 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.561664 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.580730 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.595977 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.613631 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.633992 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.651471 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.667462 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.668671 4580 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.668712 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.668724 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.668743 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.668752 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:54Z","lastTransitionTime":"2026-03-21T04:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.680595 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z 
is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: E0321 04:53:54.683403 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.693017 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.693058 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.693068 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.693082 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.693092 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:54Z","lastTransitionTime":"2026-03-21T04:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.697195 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: E0321 04:53:54.705918 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.710031 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.710381 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.710423 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.710436 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.710454 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.710466 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:54Z","lastTransitionTime":"2026-03-21T04:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.721922 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: E0321 04:53:54.723664 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.727086 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.727143 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.727153 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.727166 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.727174 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:54Z","lastTransitionTime":"2026-03-21T04:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.734462 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: E0321 04:53:54.739673 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.743323 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.743597 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.743710 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.743809 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.743889 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:53:54Z","lastTransitionTime":"2026-03-21T04:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.749249 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: E0321 04:53:54.758927 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: E0321 04:53:54.759051 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.763177 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.780862 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.802971 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.816574 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.834714 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.848638 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc kubenswrapper[4580]: I0321 04:53:54.860716 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:54 crc 
kubenswrapper[4580]: I0321 04:53:54.874481 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.357149 4580 generic.go:334] "Generic (PLEG): container finished" podID="b0217ec0-8db1-4e76-bda0-e6299469b59c" containerID="aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c" 
exitCode=0 Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.357221 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" event={"ID":"b0217ec0-8db1-4e76-bda0-e6299469b59c","Type":"ContainerDied","Data":"aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c"} Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.374143 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 
2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.388758 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.404970 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.423447 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.441228 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.461393 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.482385 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.496702 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.515426 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.533659 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc 
kubenswrapper[4580]: I0321 04:53:55.546158 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.563322 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 
04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.581306 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.604989 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.619625 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:55 crc kubenswrapper[4580]: E0321 04:53:55.620365 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.619904 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.619873 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:53:55 crc kubenswrapper[4580]: E0321 04:53:55.620845 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.619939 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:53:55 crc kubenswrapper[4580]: E0321 04:53:55.621082 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:53:55 crc kubenswrapper[4580]: E0321 04:53:55.620971 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.623162 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.636620 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.651699 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.668528 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.684437 4580 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-z5bcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.700768 4580 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: E0321 04:53:55.701523 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.712676 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.738483 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec8
51cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z
\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.754197 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\
\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.771451 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.788068 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.800964 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.823918 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.838803 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.854672 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.869960 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc kubenswrapper[4580]: I0321 04:53:55.881972 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:55 crc 
kubenswrapper[4580]: I0321 04:53:55.895173 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.364594 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" 
event={"ID":"b0217ec0-8db1-4e76-bda0-e6299469b59c","Type":"ContainerStarted","Data":"7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa"} Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.365879 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/0.log" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.368412 4580 generic.go:334] "Generic (PLEG): container finished" podID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerID="9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f" exitCode=1 Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.368597 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f"} Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.369201 4580 scope.go:117] "RemoveContainer" containerID="9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.381624 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5
45c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.397960 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.412410 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.431333 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e37389
2dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:
53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.447158 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.460873 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.474488 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.487967 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.500568 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.515670 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.545965 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.558558 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.577242 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.597381 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.612425 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc 
kubenswrapper[4580]: I0321 04:53:56.631720 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.659188 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.678185 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"message\\\":\\\"nt handler 8\\\\nI0321 04:53:55.616442 6365 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:53:55.616500 6365 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:53:55.616597 6365 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:53:55.616747 6365 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:53:55.616910 6365 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:53:55.616965 6365 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:53:55.617224 6365 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:53:55.617915 6365 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.688666 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.701085 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.711080 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.719981 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc 
kubenswrapper[4580]: I0321 04:53:56.730168 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.740707 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.751041 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.761993 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.775838 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e37389
2dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:
53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.785344 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.797084 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5
45c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.809376 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.822828 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:56 crc kubenswrapper[4580]: I0321 04:53:56.835701 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.374736 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/0.log" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.380328 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerStarted","Data":"3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb"} Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.380424 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.399101 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.413308 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.437245 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"message\\\":\\\"nt handler 8\\\\nI0321 04:53:55.616442 6365 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:53:55.616500 6365 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:53:55.616597 6365 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:53:55.616747 6365 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:53:55.616910 6365 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:53:55.616965 6365 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:53:55.617224 6365 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:53:55.617915 6365 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-n
etns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\
"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.450387 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.462846 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.478117 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.490977 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc 
kubenswrapper[4580]: I0321 04:53:57.505138 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.522634 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 
04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.538174 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.552452 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.619932 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:57 crc kubenswrapper[4580]: E0321 04:53:57.620072 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.620464 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:53:57 crc kubenswrapper[4580]: E0321 04:53:57.620535 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.620590 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:57 crc kubenswrapper[4580]: E0321 04:53:57.620647 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.620693 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:53:57 crc kubenswrapper[4580]: E0321 04:53:57.620747 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.797246 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab36
0665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.809251 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.820905 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.832672 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:57 crc kubenswrapper[4580]: I0321 04:53:57.857160 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.385383 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/1.log" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.386295 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/0.log" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.390185 4580 generic.go:334] "Generic (PLEG): container finished" podID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerID="3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb" exitCode=1 Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.390240 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb"} Mar 21 04:53:58 
crc kubenswrapper[4580]: I0321 04:53:58.390287 4580 scope.go:117] "RemoveContainer" containerID="9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.391554 4580 scope.go:117] "RemoveContainer" containerID="3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb" Mar 21 04:53:58 crc kubenswrapper[4580]: E0321 04:53:58.391715 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.408926 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.425683 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.449913 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9971f8c13e7fd35d23d734604c59a2c248e6cea303e94592982ed7964ec3050f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"message\\\":\\\"nt handler 8\\\\nI0321 04:53:55.616442 6365 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:53:55.616500 6365 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:53:55.616597 6365 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:53:55.616747 6365 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:53:55.616910 6365 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:53:55.616965 6365 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:53:55.617224 6365 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:53:55.617915 6365 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:53:58Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:53:57.412086 6565 services_controller.go:356] Processing sync for service openshift-dns-operator/metrics for network=default\\\\nI0321 04:53:57.412133 6565 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0321 04:53:57.412153 6565 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 4.244742ms\\\\nI0321 04:53:57.412170 6565 services_controller.go:356] Processing sync for service openshift-kube-apiserver-operator/metrics for network=default\\\\nF0321 04:53:57.412089 6565 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830
623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.463010 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.477297 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.491490 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.503415 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc 
kubenswrapper[4580]: I0321 04:53:58.514601 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.531803 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab36
0665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.542801 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.559702 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5
45c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.577909 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.594299 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.611261 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.625974 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:58 crc kubenswrapper[4580]: I0321 04:53:58.641961 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:53:58Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.396604 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/1.log" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.401835 4580 scope.go:117] "RemoveContainer" containerID="3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb" Mar 21 04:53:59 crc kubenswrapper[4580]: E0321 04:53:59.402013 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.422252 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.438315 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.459076 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:53:58Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:53:57.412086 6565 services_controller.go:356] Processing sync for service openshift-dns-operator/metrics for network=default\\\\nI0321 04:53:57.412133 6565 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0321 04:53:57.412153 6565 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 4.244742ms\\\\nI0321 04:53:57.412170 6565 services_controller.go:356] Processing sync for service openshift-kube-apiserver-operator/metrics for network=default\\\\nF0321 04:53:57.412089 6565 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.470659 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.485578 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.514321 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.531824 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc 
kubenswrapper[4580]: I0321 04:53:59.555304 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.579107 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 
04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.599511 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.617421 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.617494 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.617498 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:53:59 crc kubenswrapper[4580]: E0321 04:53:59.617656 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.618044 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:53:59 crc kubenswrapper[4580]: E0321 04:53:59.619667 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.619879 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.619942 4580 scope.go:117] "RemoveContainer" containerID="593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba" Mar 21 04:53:59 crc kubenswrapper[4580]: E0321 04:53:59.620031 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:53:59 crc kubenswrapper[4580]: E0321 04:53:59.620160 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:53:59 crc kubenswrapper[4580]: E0321 04:53:59.620403 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.640211 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"container
ID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\
\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.650966 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.663645 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.676105 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:53:59 crc kubenswrapper[4580]: I0321 04:53:59.689016 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:53:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:00 crc kubenswrapper[4580]: E0321 04:54:00.703264 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:54:01 crc kubenswrapper[4580]: I0321 04:54:01.436936 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:54:01 crc kubenswrapper[4580]: I0321 04:54:01.437080 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437115 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:54:17.437091261 +0000 UTC m=+162.519674889 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:54:01 crc kubenswrapper[4580]: I0321 04:54:01.437152 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs\") pod \"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:01 crc kubenswrapper[4580]: I0321 04:54:01.437186 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:01 crc kubenswrapper[4580]: I0321 04:54:01.437210 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:01 crc kubenswrapper[4580]: I0321 04:54:01.437236 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437245 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437271 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437286 4580 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437332 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:17.437321406 +0000 UTC m=+162.519905034 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437336 4580 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437365 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:17.437357337 +0000 UTC m=+162.519940975 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437372 4580 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437405 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs podName:ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:17.437396007 +0000 UTC m=+162.519979645 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs") pod "network-metrics-daemon-fpb6h" (UID: "ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437462 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437473 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437480 4580 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437502 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:17.43749448 +0000 UTC m=+162.520078108 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437532 4580 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.437549 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:17.437544581 +0000 UTC m=+162.520128209 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:54:01 crc kubenswrapper[4580]: I0321 04:54:01.617006 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:01 crc kubenswrapper[4580]: I0321 04:54:01.617102 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.617148 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:01 crc kubenswrapper[4580]: I0321 04:54:01.617096 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.617314 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.617299 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:01 crc kubenswrapper[4580]: I0321 04:54:01.617357 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:01 crc kubenswrapper[4580]: E0321 04:54:01.617438 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:03 crc kubenswrapper[4580]: I0321 04:54:03.617362 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:03 crc kubenswrapper[4580]: I0321 04:54:03.617445 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:03 crc kubenswrapper[4580]: E0321 04:54:03.618161 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:03 crc kubenswrapper[4580]: I0321 04:54:03.617693 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:03 crc kubenswrapper[4580]: I0321 04:54:03.617445 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:03 crc kubenswrapper[4580]: E0321 04:54:03.618364 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:03 crc kubenswrapper[4580]: E0321 04:54:03.618473 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:03 crc kubenswrapper[4580]: E0321 04:54:03.618608 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.065161 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.065232 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.065246 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.065266 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.065291 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:05Z","lastTransitionTime":"2026-03-21T04:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:05 crc kubenswrapper[4580]: E0321 04:54:05.082950 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.087469 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.087508 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.087519 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.087536 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.087548 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:05Z","lastTransitionTime":"2026-03-21T04:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:05 crc kubenswrapper[4580]: E0321 04:54:05.100769 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.105045 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.105099 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.105114 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.105133 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.105146 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:05Z","lastTransitionTime":"2026-03-21T04:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:05 crc kubenswrapper[4580]: E0321 04:54:05.118607 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.122625 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.122686 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.122701 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.122726 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.122742 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:05Z","lastTransitionTime":"2026-03-21T04:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:05 crc kubenswrapper[4580]: E0321 04:54:05.143933 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.152894 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.152942 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.152954 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.152976 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.152989 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:05Z","lastTransitionTime":"2026-03-21T04:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:05 crc kubenswrapper[4580]: E0321 04:54:05.165858 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: E0321 04:54:05.166011 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.617861 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.617917 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.618094 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:05 crc kubenswrapper[4580]: E0321 04:54:05.618082 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.618231 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:05 crc kubenswrapper[4580]: E0321 04:54:05.618494 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:05 crc kubenswrapper[4580]: E0321 04:54:05.618695 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:05 crc kubenswrapper[4580]: E0321 04:54:05.618761 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.639025 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.655582 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.672335 4580 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-z5bcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.690958 4580 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.703040 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: E0321 04:54:05.704901 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.728557 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.744320 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.757466 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.770863 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.783527 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.805002 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:53:58Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:53:57.412086 6565 services_controller.go:356] Processing sync for service openshift-dns-operator/metrics for network=default\\\\nI0321 04:53:57.412133 6565 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0321 04:53:57.412153 6565 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 4.244742ms\\\\nI0321 04:53:57.412170 6565 services_controller.go:356] Processing sync for service openshift-kube-apiserver-operator/metrics for network=default\\\\nF0321 04:53:57.412089 6565 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.817171 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.835304 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.849906 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc kubenswrapper[4580]: I0321 04:54:05.862579 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:05 crc 
kubenswrapper[4580]: I0321 04:54:05.874887 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:07 crc kubenswrapper[4580]: I0321 04:54:07.618279 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:07 crc kubenswrapper[4580]: I0321 04:54:07.618316 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:07 crc kubenswrapper[4580]: E0321 04:54:07.618503 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:07 crc kubenswrapper[4580]: I0321 04:54:07.618569 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:07 crc kubenswrapper[4580]: E0321 04:54:07.618716 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:07 crc kubenswrapper[4580]: E0321 04:54:07.618848 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:07 crc kubenswrapper[4580]: I0321 04:54:07.619121 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:07 crc kubenswrapper[4580]: E0321 04:54:07.619240 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:09 crc kubenswrapper[4580]: I0321 04:54:09.616855 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:09 crc kubenswrapper[4580]: I0321 04:54:09.616906 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:09 crc kubenswrapper[4580]: I0321 04:54:09.616946 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:09 crc kubenswrapper[4580]: I0321 04:54:09.616979 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:09 crc kubenswrapper[4580]: E0321 04:54:09.617042 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:09 crc kubenswrapper[4580]: E0321 04:54:09.617178 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:09 crc kubenswrapper[4580]: E0321 04:54:09.617282 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:09 crc kubenswrapper[4580]: E0321 04:54:09.617383 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:10 crc kubenswrapper[4580]: E0321 04:54:10.706227 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:54:11 crc kubenswrapper[4580]: I0321 04:54:11.618058 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:11 crc kubenswrapper[4580]: I0321 04:54:11.618132 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:11 crc kubenswrapper[4580]: E0321 04:54:11.618363 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:11 crc kubenswrapper[4580]: I0321 04:54:11.618413 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:11 crc kubenswrapper[4580]: I0321 04:54:11.618391 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:11 crc kubenswrapper[4580]: E0321 04:54:11.619068 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:11 crc kubenswrapper[4580]: E0321 04:54:11.619198 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:11 crc kubenswrapper[4580]: I0321 04:54:11.619342 4580 scope.go:117] "RemoveContainer" containerID="3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb" Mar 21 04:54:11 crc kubenswrapper[4580]: E0321 04:54:11.619431 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.475538 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/1.log" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.480478 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerStarted","Data":"40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8"} Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.481029 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.494428 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.509768 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.521916 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.536362 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.550190 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.572397 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:53:58Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:53:57.412086 6565 services_controller.go:356] Processing sync for service openshift-dns-operator/metrics for network=default\\\\nI0321 04:53:57.412133 6565 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0321 04:53:57.412153 6565 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 4.244742ms\\\\nI0321 04:53:57.412170 6565 services_controller.go:356] Processing sync for service openshift-kube-apiserver-operator/metrics for network=default\\\\nF0321 04:53:57.412089 6565 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.582700 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.595286 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.609022 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.619085 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc 
kubenswrapper[4580]: I0321 04:54:12.631190 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.646229 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 
04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.661102 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.675225 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.690992 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e37389
2dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:
53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:12 crc kubenswrapper[4580]: I0321 04:54:12.702723 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.486657 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/2.log" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.487569 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/1.log" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.498642 4580 generic.go:334] "Generic (PLEG): container finished" podID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerID="40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8" exitCode=1 Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.498714 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" 
event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8"} Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.499166 4580 scope.go:117] "RemoveContainer" containerID="3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.499711 4580 scope.go:117] "RemoveContainer" containerID="40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8" Mar 21 04:54:13 crc kubenswrapper[4580]: E0321 04:54:13.499990 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.518590 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.534933 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.550382 4580 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-z5bcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.568846 4580 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.583146 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.600866 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.617143 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.617226 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:13 crc kubenswrapper[4580]: E0321 04:54:13.617422 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.617447 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.617551 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.617721 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:13 crc kubenswrapper[4580]: E0321 04:54:13.618297 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:13 crc kubenswrapper[4580]: E0321 04:54:13.618384 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.618436 4580 scope.go:117] "RemoveContainer" containerID="593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba" Mar 21 04:54:13 crc kubenswrapper[4580]: E0321 04:54:13.618486 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:13 crc kubenswrapper[4580]: E0321 04:54:13.618652 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.632586 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.634276 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.647184 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.663335 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.683862 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d7a9e86374ece252480aaecf12d524faaacb8f6cb06d6ddb267bd9a1cfa31cb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:53:58Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:53:57.412086 6565 services_controller.go:356] Processing sync for service openshift-dns-operator/metrics for network=default\\\\nI0321 04:53:57.412133 6565 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0321 04:53:57.412153 6565 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 4.244742ms\\\\nI0321 04:53:57.412170 6565 services_controller.go:356] Processing sync for service openshift-kube-apiserver-operator/metrics for network=default\\\\nF0321 04:53:57.412089 6565 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:12Z\\\",\\\"message\\\":\\\"oller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:54:12.477044 6738 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:54:12.477211 6738 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}\\\\nI0321 04:54:12.477265 6738 services_controller.go:360] Finished syncing service controller-manager on namespace openshift-controller-manager for network=default : 3.68558ms\\\\nF0321 04:54:12.476586 6738 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830
623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.699989 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.718161 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.734766 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc kubenswrapper[4580]: I0321 04:54:13.745902 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:13 crc 
kubenswrapper[4580]: I0321 04:54:13.756686 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.505750 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/2.log" Mar 21 04:54:14 
crc kubenswrapper[4580]: I0321 04:54:14.509337 4580 scope.go:117] "RemoveContainer" containerID="40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8" Mar 21 04:54:14 crc kubenswrapper[4580]: E0321 04:54:14.509586 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.523091 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.537140 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.549393 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc 
kubenswrapper[4580]: I0321 04:54:14.562445 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.621641 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.638113 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.652443 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.666543 4580 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-z5bcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.680873 4580 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.692142 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e5c1013-1954-43fa-9340-02f5bc2176b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.704631 4580 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.718025 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.730148 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.742823 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.756010 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.777368 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:12Z\\\",\\\"message\\\":\\\"oller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:54:12.477044 6738 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:54:12.477211 6738 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}\\\\nI0321 04:54:12.477265 6738 services_controller.go:360] Finished syncing service controller-manager on namespace openshift-controller-manager for network=default : 3.68558ms\\\\nF0321 04:54:12.476586 6738 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:14 crc kubenswrapper[4580]: I0321 04:54:14.791013 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.453857 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.453899 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.453908 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.453926 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.453936 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:15Z","lastTransitionTime":"2026-03-21T04:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:15 crc kubenswrapper[4580]: E0321 04:54:15.466867 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.471452 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.471506 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.471516 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.471533 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.471544 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:15Z","lastTransitionTime":"2026-03-21T04:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:15 crc kubenswrapper[4580]: E0321 04:54:15.484249 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.488989 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.489076 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.489088 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.489108 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.489122 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:15Z","lastTransitionTime":"2026-03-21T04:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:15 crc kubenswrapper[4580]: E0321 04:54:15.502294 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.506702 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.506748 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.506765 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.506805 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.506823 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:15Z","lastTransitionTime":"2026-03-21T04:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:15 crc kubenswrapper[4580]: E0321 04:54:15.520309 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.524459 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.524489 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.524499 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.524517 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.524529 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:15Z","lastTransitionTime":"2026-03-21T04:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:15 crc kubenswrapper[4580]: E0321 04:54:15.537190 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: E0321 04:54:15.537361 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.617560 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.617741 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:15 crc kubenswrapper[4580]: E0321 04:54:15.617755 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:15 crc kubenswrapper[4580]: E0321 04:54:15.617828 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.617859 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:15 crc kubenswrapper[4580]: E0321 04:54:15.617908 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.617925 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:15 crc kubenswrapper[4580]: E0321 04:54:15.617964 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.633160 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.657767 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:12Z\\\",\\\"message\\\":\\\"oller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:54:12.477044 6738 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:54:12.477211 6738 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}\\\\nI0321 04:54:12.477265 6738 services_controller.go:360] Finished syncing service controller-manager on namespace openshift-controller-manager for network=default : 3.68558ms\\\\nF0321 04:54:12.476586 6738 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.670482 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.688275 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.705763 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: E0321 04:54:15.707713 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.722644 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc 
kubenswrapper[4580]: I0321 04:54:15.741219 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.755697 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.771035 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.785933 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.800585 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e37389
2dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:
53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.813110 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.828293 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5
45c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.839203 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e5c1013-1954-43fa-9340-02f5bc2176b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.851690 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.864076 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:15 crc kubenswrapper[4580]: I0321 04:54:15.886909 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:17 crc kubenswrapper[4580]: I0321 04:54:17.528273 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:54:17 crc kubenswrapper[4580]: I0321 04:54:17.528391 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:17 crc kubenswrapper[4580]: I0321 04:54:17.528429 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs\") pod \"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.528559 4580 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.528579 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:54:49.5285267 +0000 UTC m=+194.611110338 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.528630 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs podName:ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:49.528616592 +0000 UTC m=+194.611200450 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs") pod "network-metrics-daemon-fpb6h" (UID: "ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.528722 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:54:17 crc kubenswrapper[4580]: I0321 04:54:17.528739 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.528766 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.528803 4580 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:54:17 crc kubenswrapper[4580]: I0321 04:54:17.528805 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:17 crc kubenswrapper[4580]: I0321 04:54:17.528866 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.528902 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:49.528872457 +0000 UTC m=+194.611456265 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.528980 4580 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.528994 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.529015 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.529026 4580 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.529034 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:49.52901349 +0000 UTC m=+194.611597308 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.529094 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:49.529083792 +0000 UTC m=+194.611667660 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.529148 4580 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.529221 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:54:49.529209205 +0000 UTC m=+194.611793063 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:54:17 crc kubenswrapper[4580]: I0321 04:54:17.617436 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:17 crc kubenswrapper[4580]: I0321 04:54:17.617499 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:17 crc kubenswrapper[4580]: I0321 04:54:17.617519 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.617612 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.617554 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:17 crc kubenswrapper[4580]: I0321 04:54:17.617559 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.617731 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:17 crc kubenswrapper[4580]: E0321 04:54:17.617763 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:19 crc kubenswrapper[4580]: I0321 04:54:19.617414 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:19 crc kubenswrapper[4580]: I0321 04:54:19.617483 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:19 crc kubenswrapper[4580]: I0321 04:54:19.617514 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:19 crc kubenswrapper[4580]: I0321 04:54:19.617635 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:19 crc kubenswrapper[4580]: E0321 04:54:19.617619 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:19 crc kubenswrapper[4580]: E0321 04:54:19.617802 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:19 crc kubenswrapper[4580]: E0321 04:54:19.617912 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:19 crc kubenswrapper[4580]: E0321 04:54:19.618033 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:20 crc kubenswrapper[4580]: E0321 04:54:20.708744 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:54:21 crc kubenswrapper[4580]: I0321 04:54:21.617753 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:21 crc kubenswrapper[4580]: I0321 04:54:21.617768 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:21 crc kubenswrapper[4580]: E0321 04:54:21.618024 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:21 crc kubenswrapper[4580]: I0321 04:54:21.617833 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:21 crc kubenswrapper[4580]: I0321 04:54:21.617838 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:21 crc kubenswrapper[4580]: E0321 04:54:21.618150 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:21 crc kubenswrapper[4580]: E0321 04:54:21.618366 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:21 crc kubenswrapper[4580]: E0321 04:54:21.618393 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:23 crc kubenswrapper[4580]: I0321 04:54:23.617854 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:23 crc kubenswrapper[4580]: I0321 04:54:23.617892 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:23 crc kubenswrapper[4580]: E0321 04:54:23.618053 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:23 crc kubenswrapper[4580]: I0321 04:54:23.618093 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:23 crc kubenswrapper[4580]: I0321 04:54:23.618146 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:23 crc kubenswrapper[4580]: E0321 04:54:23.618282 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:23 crc kubenswrapper[4580]: E0321 04:54:23.618435 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:23 crc kubenswrapper[4580]: E0321 04:54:23.618516 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.617403 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.617482 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.617504 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.617404 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:25 crc kubenswrapper[4580]: E0321 04:54:25.617565 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:25 crc kubenswrapper[4580]: E0321 04:54:25.617635 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:25 crc kubenswrapper[4580]: E0321 04:54:25.617827 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:25 crc kubenswrapper[4580]: E0321 04:54:25.617966 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.637229 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.653493 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.672272 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:12Z\\\",\\\"message\\\":\\\"oller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:54:12.477044 6738 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:54:12.477211 6738 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}\\\\nI0321 04:54:12.477265 6738 services_controller.go:360] Finished syncing service controller-manager on namespace openshift-controller-manager for network=default : 3.68558ms\\\\nF0321 04:54:12.476586 6738 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.686763 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.703088 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: E0321 04:54:25.711764 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.725682 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\"
 for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.746259 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc 
kubenswrapper[4580]: I0321 04:54:25.763410 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.783635 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 
04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.798447 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.798828 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.798933 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.799037 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:25 crc 
kubenswrapper[4580]: I0321 04:54:25.799132 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:25Z","lastTransitionTime":"2026-03-21T04:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.804743 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: E0321 04:54:25.815653 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.820076 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.820125 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.820138 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.820155 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.820170 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:25Z","lastTransitionTime":"2026-03-21T04:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.822026 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z 
is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: E0321 04:54:25.837253 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.841542 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742f
d0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mou
ntPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.843540 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.843591 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.843603 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.843624 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.843635 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:25Z","lastTransitionTime":"2026-03-21T04:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.855072 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: E0321 04:54:25.858254 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.862870 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.862947 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.862963 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.862982 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.862996 4580 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:25Z","lastTransitionTime":"2026-03-21T04:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.866310 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e5c1013-1954-43fa-9340-02f5bc2176b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: E0321 04:54:25.875316 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.877681 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" 
Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.878743 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.878766 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.878775 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.878810 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.878821 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:25Z","lastTransitionTime":"2026-03-21T04:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.890834 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: E0321 04:54:25.890743 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:25Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:25 crc kubenswrapper[4580]: E0321 04:54:25.891369 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:54:25 crc kubenswrapper[4580]: I0321 04:54:25.905500 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:27 crc kubenswrapper[4580]: I0321 04:54:27.618060 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:27 crc kubenswrapper[4580]: I0321 04:54:27.618181 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:27 crc kubenswrapper[4580]: I0321 04:54:27.618242 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:27 crc kubenswrapper[4580]: E0321 04:54:27.619083 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:27 crc kubenswrapper[4580]: I0321 04:54:27.618364 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:27 crc kubenswrapper[4580]: E0321 04:54:27.619315 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:27 crc kubenswrapper[4580]: E0321 04:54:27.619360 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:27 crc kubenswrapper[4580]: E0321 04:54:27.619379 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:27 crc kubenswrapper[4580]: I0321 04:54:27.620019 4580 scope.go:117] "RemoveContainer" containerID="593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba" Mar 21 04:54:27 crc kubenswrapper[4580]: E0321 04:54:27.620292 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:54:28 crc kubenswrapper[4580]: I0321 04:54:28.635344 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 21 04:54:29 crc kubenswrapper[4580]: I0321 04:54:29.617606 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:29 crc kubenswrapper[4580]: I0321 04:54:29.617657 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:29 crc kubenswrapper[4580]: I0321 04:54:29.617633 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:29 crc kubenswrapper[4580]: I0321 04:54:29.617837 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:29 crc kubenswrapper[4580]: E0321 04:54:29.617827 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:29 crc kubenswrapper[4580]: E0321 04:54:29.618026 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:29 crc kubenswrapper[4580]: E0321 04:54:29.618086 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:29 crc kubenswrapper[4580]: E0321 04:54:29.618173 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:29 crc kubenswrapper[4580]: I0321 04:54:29.619003 4580 scope.go:117] "RemoveContainer" containerID="40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8" Mar 21 04:54:29 crc kubenswrapper[4580]: E0321 04:54:29.619203 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" Mar 21 04:54:30 crc kubenswrapper[4580]: E0321 04:54:30.713502 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:54:31 crc kubenswrapper[4580]: I0321 04:54:31.617677 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:31 crc kubenswrapper[4580]: E0321 04:54:31.617903 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:31 crc kubenswrapper[4580]: I0321 04:54:31.618055 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:31 crc kubenswrapper[4580]: E0321 04:54:31.618214 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:31 crc kubenswrapper[4580]: I0321 04:54:31.618234 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:31 crc kubenswrapper[4580]: I0321 04:54:31.618270 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:31 crc kubenswrapper[4580]: E0321 04:54:31.618330 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:31 crc kubenswrapper[4580]: E0321 04:54:31.618388 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.577727 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z5bcs_f6761e28-8a0c-4ea2-b248-2bd60e3862e6/kube-multus/0.log" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.578421 4580 generic.go:334] "Generic (PLEG): container finished" podID="f6761e28-8a0c-4ea2-b248-2bd60e3862e6" containerID="2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57" exitCode=1 Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.578507 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5bcs" event={"ID":"f6761e28-8a0c-4ea2-b248-2bd60e3862e6","Type":"ContainerDied","Data":"2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57"} Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.579225 4580 scope.go:117] "RemoveContainer" containerID="2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.606348 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.617254 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:33 crc kubenswrapper[4580]: E0321 04:54:33.617431 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.617674 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:33 crc kubenswrapper[4580]: E0321 04:54:33.617754 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.617931 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:33 crc kubenswrapper[4580]: E0321 04:54:33.618010 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.618157 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:33 crc kubenswrapper[4580]: E0321 04:54:33.618237 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.626821 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.650418 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:12Z\\\",\\\"message\\\":\\\"oller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:54:12.477044 6738 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:54:12.477211 6738 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}\\\\nI0321 04:54:12.477265 6738 services_controller.go:360] Finished syncing service controller-manager on namespace openshift-controller-manager for network=default : 3.68558ms\\\\nF0321 04:54:12.476586 6738 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.662404 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.680858 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.700098 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.717238 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc 
kubenswrapper[4580]: I0321 04:54:33.734617 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.758299 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 
04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.775804 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.796613 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:33Z\\\",\\\"message\\\":\\\"2026-03-21T04:53:47+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654\\\\n2026-03-21T04:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654 to /host/opt/cni/bin/\\\\n2026-03-21T04:53:48Z [verbose] multus-daemon started\\\\n2026-03-21T04:53:48Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:54:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.815278 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab36
0665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.827363 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.849875 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"298538c0-bbe0-440c-a0a0-d166812f1ba6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25e55cf6b1c2feb181c0c5139f51f295f900960f5d429a616ebe01bed366c67c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c903f61cf0e6b078fa36787397aa50d7244f2f58db68e6eaab165f0b44dacd7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d40733302e6914f45c270751c177d3b83a20ff24a950b73a289fcf7abadd3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c8717bf5a92472b2134747e609ccf93c68e68a2021d51be80e4ca5aae8042e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a719c23dcc7675e21e14902c6fe2ef4c18bc715a610599abbe9f2b775da9c25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.866366 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e5c1013-1954-43fa-9340-02f5bc2176b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.879379 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name
\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.896147 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:33 crc kubenswrapper[4580]: I0321 04:54:33.913164 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.584331 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z5bcs_f6761e28-8a0c-4ea2-b248-2bd60e3862e6/kube-multus/0.log" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.584406 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5bcs" event={"ID":"f6761e28-8a0c-4ea2-b248-2bd60e3862e6","Type":"ContainerStarted","Data":"54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8"} Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.601364 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.626728 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:12Z\\\",\\\"message\\\":\\\"oller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:54:12.477044 6738 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:54:12.477211 6738 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}\\\\nI0321 04:54:12.477265 6738 services_controller.go:360] Finished syncing service controller-manager on namespace openshift-controller-manager for network=default : 3.68558ms\\\\nF0321 04:54:12.476586 6738 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.640359 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.657654 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.673980 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.688439 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc 
kubenswrapper[4580]: I0321 04:54:34.703006 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.722774 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.740162 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.755775 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:33Z\\\",\\\"message\\\":\\\"2026-03-21T04:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654\\\\n2026-03-21T04:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654 to /host/opt/cni/bin/\\\\n2026-03-21T04:53:48Z [verbose] multus-daemon started\\\\n2026-03-21T04:53:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:54:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.772689 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774
9f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.786173 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.801990 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.818197 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e5c1013-1954-43fa-9340-02f5bc2176b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2
42b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 
2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.832320 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.848738 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.864442 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:34 crc kubenswrapper[4580]: I0321 04:54:34.893311 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"298538c0-bbe0-440c-a0a0-d166812f1ba6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25e55cf6b1c2feb181c0c5139f51f295f900960f5d429a616ebe01bed366c67c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c903f61cf0e6b078fa36787397aa50d7244f2f58db68e6eaab165f0b44dacd7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d40733302e6914f45c270751c177d3b83a20ff24a950b73a289fcf7abadd3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c8717bf5a92472b2134747e609ccf93c68e68a2021d51be80e4ca5aae8042e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a719c23dcc7675e21e14902c6fe2ef4c18bc715a610599abbe9f2b775da9c25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.617961 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.618123 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.618145 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.618181 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:35 crc kubenswrapper[4580]: E0321 04:54:35.618698 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:35 crc kubenswrapper[4580]: E0321 04:54:35.618894 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:35 crc kubenswrapper[4580]: E0321 04:54:35.618982 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:35 crc kubenswrapper[4580]: E0321 04:54:35.619092 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.633666 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop 
'(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.650561 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.663638 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc 
kubenswrapper[4580]: I0321 04:54:35.701728 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: E0321 04:54:35.715368 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.716186 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.733112 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.748946 4580 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-z5bcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:33Z\\\",\\\"message\\\":\\\"2026-03-21T04:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654\\\\n2026-03-21T04:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654 to /host/opt/cni/bin/\\\\n2026-03-21T04:53:48Z [verbose] 
multus-daemon started\\\\n2026-03-21T04:53:48Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:54:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"
name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.769297 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab36
0665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.783214 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.804912 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"298538c0-bbe0-440c-a0a0-d166812f1ba6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25e55cf6b1c2feb181c0c5139f51f295f900960f5d429a616ebe01bed366c67c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c903f61cf0e6b078fa36787397aa50d7244f2f58db68e6eaab165f0b44dacd7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d40733302e6914f45c270751c177d3b83a20ff24a950b73a289fcf7abadd3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c8717bf5a92472b2134747e609ccf93c68e68a2021d51be80e4ca5aae8042e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a719c23dcc7675e21e14902c6fe2ef4c18bc715a610599abbe9f2b775da9c25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.818656 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e5c1013-1954-43fa-9340-02f5bc2176b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.831850 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name
\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.845547 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.859756 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.875697 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.891159 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.913603 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:12Z\\\",\\\"message\\\":\\\"oller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:54:12.477044 6738 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:54:12.477211 6738 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}\\\\nI0321 04:54:12.477265 6738 services_controller.go:360] Finished syncing service controller-manager on namespace openshift-controller-manager for network=default : 3.68558ms\\\\nF0321 04:54:12.476586 6738 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:35 crc kubenswrapper[4580]: I0321 04:54:35.931457 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.129336 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.129386 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.129400 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.129416 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.129427 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:36Z","lastTransitionTime":"2026-03-21T04:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:36 crc kubenswrapper[4580]: E0321 04:54:36.142733 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.146501 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.146551 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.146563 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.146589 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.146628 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:36Z","lastTransitionTime":"2026-03-21T04:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:36 crc kubenswrapper[4580]: E0321 04:54:36.160385 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.165094 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.165133 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.165143 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.165160 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.165174 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:36Z","lastTransitionTime":"2026-03-21T04:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:36 crc kubenswrapper[4580]: E0321 04:54:36.178576 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.182307 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.182345 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.182358 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.182374 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.182385 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:36Z","lastTransitionTime":"2026-03-21T04:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:36 crc kubenswrapper[4580]: E0321 04:54:36.196178 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.201193 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.201242 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.201253 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.201271 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:36 crc kubenswrapper[4580]: I0321 04:54:36.201281 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:36Z","lastTransitionTime":"2026-03-21T04:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:36 crc kubenswrapper[4580]: E0321 04:54:36.213627 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:36 crc kubenswrapper[4580]: E0321 04:54:36.213755 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:54:37 crc kubenswrapper[4580]: I0321 04:54:37.618152 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:37 crc kubenswrapper[4580]: I0321 04:54:37.618225 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:37 crc kubenswrapper[4580]: I0321 04:54:37.618267 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:37 crc kubenswrapper[4580]: E0321 04:54:37.618315 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:37 crc kubenswrapper[4580]: I0321 04:54:37.618170 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:37 crc kubenswrapper[4580]: E0321 04:54:37.618448 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:37 crc kubenswrapper[4580]: E0321 04:54:37.618536 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:37 crc kubenswrapper[4580]: E0321 04:54:37.618599 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:39 crc kubenswrapper[4580]: I0321 04:54:39.617501 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:39 crc kubenswrapper[4580]: I0321 04:54:39.617500 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:39 crc kubenswrapper[4580]: I0321 04:54:39.617684 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:39 crc kubenswrapper[4580]: I0321 04:54:39.617864 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:39 crc kubenswrapper[4580]: E0321 04:54:39.617953 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:39 crc kubenswrapper[4580]: E0321 04:54:39.617852 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:39 crc kubenswrapper[4580]: E0321 04:54:39.618037 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:39 crc kubenswrapper[4580]: E0321 04:54:39.618097 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:39 crc kubenswrapper[4580]: I0321 04:54:39.618126 4580 scope.go:117] "RemoveContainer" containerID="593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba" Mar 21 04:54:39 crc kubenswrapper[4580]: E0321 04:54:39.618327 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:54:40 crc kubenswrapper[4580]: E0321 04:54:40.717322 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:54:41 crc kubenswrapper[4580]: I0321 04:54:41.617455 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:41 crc kubenswrapper[4580]: E0321 04:54:41.617621 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:41 crc kubenswrapper[4580]: I0321 04:54:41.617926 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:41 crc kubenswrapper[4580]: I0321 04:54:41.617936 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:41 crc kubenswrapper[4580]: I0321 04:54:41.618135 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:41 crc kubenswrapper[4580]: E0321 04:54:41.618013 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:41 crc kubenswrapper[4580]: E0321 04:54:41.618306 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:41 crc kubenswrapper[4580]: E0321 04:54:41.618488 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:43 crc kubenswrapper[4580]: I0321 04:54:43.617432 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:43 crc kubenswrapper[4580]: I0321 04:54:43.617432 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:43 crc kubenswrapper[4580]: E0321 04:54:43.618273 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:43 crc kubenswrapper[4580]: I0321 04:54:43.617538 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:43 crc kubenswrapper[4580]: E0321 04:54:43.618371 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:43 crc kubenswrapper[4580]: I0321 04:54:43.617502 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:43 crc kubenswrapper[4580]: E0321 04:54:43.618181 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:43 crc kubenswrapper[4580]: E0321 04:54:43.618433 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:44 crc kubenswrapper[4580]: I0321 04:54:44.617481 4580 scope.go:117] "RemoveContainer" containerID="40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.617649 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:45 crc kubenswrapper[4580]: E0321 04:54:45.618364 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.617717 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:45 crc kubenswrapper[4580]: E0321 04:54:45.618441 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.617683 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.618019 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:45 crc kubenswrapper[4580]: E0321 04:54:45.618507 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:45 crc kubenswrapper[4580]: E0321 04:54:45.618630 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.624986 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/2.log" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.628077 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerStarted","Data":"b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d"} Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.632911 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5
45c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.647807 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.662155 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:33Z\\\",\\\"message\\\":\\\"2026-03-21T04:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654\\\\n2026-03-21T04:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654 to /host/opt/cni/bin/\\\\n2026-03-21T04:53:48Z [verbose] multus-daemon started\\\\n2026-03-21T04:53:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:54:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.679618 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774
9f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.693329 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.716037 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"298538c0-bbe0-440c-a0a0-d166812f1ba6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25e55cf6b1c2feb181c0c5139f51f295f900960f5d429a616ebe01bed366c67c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c903f61cf0e6b078fa36787397aa50d7244f2f58db68e6eaab165f0b44dacd7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d40733302e6914f45c270751c177d3b83a20ff24a950b73a289fcf7abadd3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04
:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c8717bf5a92472b2134747e609ccf93c68e68a2021d51be80e4ca5aae8042e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a719c23dcc7675e21e14902c6fe2ef4c18bc715a610599abbe9f2b775da9c25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: E0321 04:54:45.718621 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.732336 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e5c1013-1954-43fa-9340-02f5bc2176b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.758033 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.785648 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.809499 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.826871 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.839956 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.858850 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:12Z\\\",\\\"message\\\":\\\"oller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:54:12.477044 6738 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:54:12.477211 6738 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}\\\\nI0321 04:54:12.477265 6738 services_controller.go:360] Finished syncing service controller-manager on namespace openshift-controller-manager for network=default : 3.68558ms\\\\nF0321 04:54:12.476586 6738 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.870128 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.883145 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.897975 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc kubenswrapper[4580]: I0321 04:54:45.908572 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:45 crc 
kubenswrapper[4580]: I0321 04:54:45.920831 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.346654 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.346708 4580 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.346723 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.346748 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.346763 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:46Z","lastTransitionTime":"2026-03-21T04:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:54:46 crc kubenswrapper[4580]: E0321 04:54:46.361641 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.366829 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.366885 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.366897 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.366917 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.366931 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:46Z","lastTransitionTime":"2026-03-21T04:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:46 crc kubenswrapper[4580]: E0321 04:54:46.384015 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.390226 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.390285 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.390298 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.390320 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.390334 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:46Z","lastTransitionTime":"2026-03-21T04:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:46 crc kubenswrapper[4580]: E0321 04:54:46.418040 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.423480 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.423531 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.423546 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.423561 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.423576 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:46Z","lastTransitionTime":"2026-03-21T04:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:46 crc kubenswrapper[4580]: E0321 04:54:46.437560 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.442665 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.442709 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.442722 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.442741 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.442753 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:46Z","lastTransitionTime":"2026-03-21T04:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:46 crc kubenswrapper[4580]: E0321 04:54:46.458807 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: E0321 04:54:46.459424 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.633353 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/3.log" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.635498 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/2.log" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.638556 4580 generic.go:334] "Generic (PLEG): container finished" podID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerID="b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d" exitCode=1 Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.638600 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d"} Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.638648 4580 scope.go:117] "RemoveContainer" containerID="40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.639555 4580 scope.go:117] "RemoveContainer" containerID="b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d" Mar 21 04:54:46 crc kubenswrapper[4580]: E0321 04:54:46.639822 4580 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.654522 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc 
kubenswrapper[4580]: I0321 04:54:46.668343 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.682764 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.696679 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.709961 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.723537 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:33Z\\\",\\\"message\\\":\\\"2026-03-21T04:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654\\\\n2026-03-21T04:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654 to /host/opt/cni/bin/\\\\n2026-03-21T04:53:48Z [verbose] multus-daemon started\\\\n2026-03-21T04:53:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:54:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.740969 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774
9f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.758361 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.774517 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.787481 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.802415 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.817924 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.842572 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"298538c0-bbe0-440c-a0a0-d166812f1ba6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25e55cf6b1c2feb181c0c5139f51f295f900960f5d429a616ebe01bed366c67c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c903f61cf0e6b078fa36787397aa50d7244f2f58db68e6eaab165f0b44dacd7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d40733302e6914f45c270751c177d3b83a20ff24a950b73a289fcf7abadd3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c8717bf5a92472b2134747e609ccf93c68e68a2021d51be80e4ca5aae8042e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a719c23dcc7675e21e14902c6fe2ef4c18bc715a610599abbe9f2b775da9c25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.855622 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e5c1013-1954-43fa-9340-02f5bc2176b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.877116 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40de2ee4c95d869c71a8ddad2910de97779a4be261d57d2e03c308af6b95c9b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:12Z\\\",\\\"message\\\":\\\"oller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0321 04:54:12.477044 6738 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:54:12.477211 6738 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}\\\\nI0321 04:54:12.477265 6738 services_controller.go:360] Finished syncing service controller-manager on namespace openshift-controller-manager for network=default : 3.68558ms\\\\nF0321 04:54:12.476586 6738 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"flector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.154561 7061 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.154599 7061 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.155026 7061 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.164661 7061 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 04:54:46.164717 7061 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:54:46.164760 7061 factory.go:656] Stopping watch factory\\\\nI0321 04:54:46.164806 7061 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:54:46.164816 7061 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:54:46.175857 7061 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0321 04:54:46.175901 7061 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0321 04:54:46.175974 7061 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:54:46.176003 7061 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:54:46.176112 7061 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.889730 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.903761 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:46 crc kubenswrapper[4580]: I0321 04:54:46.918700 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.617443 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:47 crc kubenswrapper[4580]: E0321 04:54:47.617656 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.618010 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:47 crc kubenswrapper[4580]: E0321 04:54:47.618101 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.618098 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:47 crc kubenswrapper[4580]: E0321 04:54:47.618208 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.618283 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:47 crc kubenswrapper[4580]: E0321 04:54:47.618499 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.644581 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/3.log" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.648656 4580 scope.go:117] "RemoveContainer" containerID="b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d" Mar 21 04:54:47 crc kubenswrapper[4580]: E0321 04:54:47.648876 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.665186 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.678718 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.694274 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc 
kubenswrapper[4580]: I0321 04:54:47.707531 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.723596 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab36
0665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.737456 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.750617 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5
45c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.766374 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.782122 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:33Z\\\",\\\"message\\\":\\\"2026-03-21T04:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654\\\\n2026-03-21T04:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654 to /host/opt/cni/bin/\\\\n2026-03-21T04:53:48Z [verbose] multus-daemon started\\\\n2026-03-21T04:53:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:54:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.797252 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.820859 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"298538c0-bbe0-440c-a0a0-d166812f1ba6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25e55cf6b1c2feb181c0c5139f51f295f900960f5d429a616ebe01bed366c67c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c903f61cf0e6b078fa36787397aa50d7244f2f58db68e6eaab165f0b44dacd7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d40733302e6914f45c270751c177d3b83a20ff24a950b73a289fcf7abadd3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c8717bf5a92472b2134747e609ccf93c68e68a2021d51be80e4ca5aae8042e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a719c23dcc7675e21e14902c6fe2ef4c18bc715a610599abbe9f2b775da9c25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.834857 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e5c1013-1954-43fa-9340-02f5bc2176b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.848710 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name
\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.863711 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.880440 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.894911 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.915566 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"flector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.154561 7061 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.154599 7061 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.155026 7061 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.164661 7061 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 04:54:46.164717 7061 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:54:46.164760 7061 factory.go:656] Stopping watch factory\\\\nI0321 04:54:46.164806 7061 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:54:46.164816 7061 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:54:46.175857 7061 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0321 04:54:46.175901 7061 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0321 04:54:46.175974 7061 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:54:46.176003 7061 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:54:46.176112 7061 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:47 crc kubenswrapper[4580]: I0321 04:54:47.927260 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:49 crc kubenswrapper[4580]: I0321 04:54:49.579415 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:54:49 crc kubenswrapper[4580]: I0321 04:54:49.579577 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs\") pod \"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.579691 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:53.579647129 +0000 UTC m=+258.662230757 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.579713 4580 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:54:49 crc kubenswrapper[4580]: I0321 04:54:49.579766 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.579803 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs podName:ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7 nodeName:}" failed. No retries permitted until 2026-03-21 04:55:53.579767372 +0000 UTC m=+258.662351000 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs") pod "network-metrics-daemon-fpb6h" (UID: "ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:54:49 crc kubenswrapper[4580]: I0321 04:54:49.579822 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:49 crc kubenswrapper[4580]: I0321 04:54:49.579853 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:49 crc kubenswrapper[4580]: I0321 04:54:49.579886 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.579983 4580 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.580007 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: 
object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.580020 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.580038 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.580054 4580 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.580062 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:55:53.580040569 +0000 UTC m=+258.662624197 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.580084 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-21 04:55:53.58007559 +0000 UTC m=+258.662659218 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.580022 4580 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.580103 4580 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.580133 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:55:53.580126791 +0000 UTC m=+258.662710419 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.580020 4580 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.580172 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:55:53.580162522 +0000 UTC m=+258.662746150 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:54:49 crc kubenswrapper[4580]: I0321 04:54:49.618058 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:49 crc kubenswrapper[4580]: I0321 04:54:49.618125 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.618207 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.618283 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:49 crc kubenswrapper[4580]: I0321 04:54:49.618362 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.618410 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:49 crc kubenswrapper[4580]: I0321 04:54:49.618485 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:49 crc kubenswrapper[4580]: E0321 04:54:49.618546 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:50 crc kubenswrapper[4580]: E0321 04:54:50.720999 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:54:51 crc kubenswrapper[4580]: I0321 04:54:51.617130 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:51 crc kubenswrapper[4580]: I0321 04:54:51.617343 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:51 crc kubenswrapper[4580]: E0321 04:54:51.617904 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:51 crc kubenswrapper[4580]: I0321 04:54:51.617377 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:51 crc kubenswrapper[4580]: E0321 04:54:51.617998 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:51 crc kubenswrapper[4580]: I0321 04:54:51.617344 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:51 crc kubenswrapper[4580]: I0321 04:54:51.617716 4580 scope.go:117] "RemoveContainer" containerID="593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba" Mar 21 04:54:51 crc kubenswrapper[4580]: E0321 04:54:51.618073 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:51 crc kubenswrapper[4580]: E0321 04:54:51.618216 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.667570 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.669488 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84"} Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.669913 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.692552 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec3
34684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.705944 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.719627 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.741094 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"298538c0-bbe0-440c-a0a0-d166812f1ba6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25e55cf6b1c2feb181c0c5139f51f295f900960f5d429a616ebe01bed366c67c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c903f61cf0e6b078fa36787397aa50d7244f2f58db68e6eaab165f0b44dacd7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d40733302e6914f45c270751c177d3b83a20ff24a950b73a289fcf7abadd3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c8717bf5a92472b2134747e609ccf93c68e68a2021d51be80e4ca5aae8042e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a719c23dcc7675e21e14902c6fe2ef4c18bc715a610599abbe9f2b775da9c25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.754246 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e5c1013-1954-43fa-9340-02f5bc2176b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.773315 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"flector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.154561 7061 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.154599 7061 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.155026 7061 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.164661 7061 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 04:54:46.164717 7061 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:54:46.164760 7061 factory.go:656] Stopping watch factory\\\\nI0321 04:54:46.164806 7061 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:54:46.164816 7061 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:54:46.175857 7061 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0321 04:54:46.175901 7061 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0321 04:54:46.175974 7061 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:54:46.176003 7061 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:54:46.176112 7061 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.785472 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.802250 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.818222 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.833364 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc 
kubenswrapper[4580]: I0321 04:54:52.849250 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.866639 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.882683 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.899263 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.913699 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:33Z\\\",\\\"message\\\":\\\"2026-03-21T04:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654\\\\n2026-03-21T04:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654 to /host/opt/cni/bin/\\\\n2026-03-21T04:53:48Z [verbose] multus-daemon started\\\\n2026-03-21T04:53:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:54:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.931259 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774
9f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.947733 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:52 crc kubenswrapper[4580]: I0321 04:54:52.963192 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:53 crc kubenswrapper[4580]: I0321 04:54:53.618014 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:53 crc kubenswrapper[4580]: I0321 04:54:53.618145 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:53 crc kubenswrapper[4580]: I0321 04:54:53.618013 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:53 crc kubenswrapper[4580]: E0321 04:54:53.618185 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:53 crc kubenswrapper[4580]: I0321 04:54:53.618227 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:53 crc kubenswrapper[4580]: E0321 04:54:53.618308 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:53 crc kubenswrapper[4580]: E0321 04:54:53.618388 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:53 crc kubenswrapper[4580]: E0321 04:54:53.618791 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:53 crc kubenswrapper[4580]: I0321 04:54:53.637805 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.617307 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:55 crc kubenswrapper[4580]: E0321 04:54:55.618096 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.617361 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:55 crc kubenswrapper[4580]: E0321 04:54:55.618197 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.617603 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:55 crc kubenswrapper[4580]: E0321 04:54:55.618280 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.617353 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:55 crc kubenswrapper[4580]: E0321 04:54:55.618343 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.635056 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.655360 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"298538c0-bbe0-440c-a0a0-d166812f1ba6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25e55cf6b1c2feb181c0c5139f51f295f900960f5d429a616ebe01bed366c67c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c903f61cf0e6b078fa36787397aa50d7244f2f58db68e6eaab165f0b44dacd7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d40733302e6914f45c270751c177d3b83a20ff24a950b73a289fcf7abadd3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c8717bf5a92472b2134747e609ccf93c68e68a2021d51be80e4ca5aae8042e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a719c23dcc7675e21e14902c6fe2ef4c18bc715a610599abbe9f2b775da9c25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.669298 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e5c1013-1954-43fa-9340-02f5bc2176b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.683981 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name
\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.696540 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.709328 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: E0321 04:54:55.722836 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.725029 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.751134 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"flector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.154561 7061 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.154599 7061 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.155026 7061 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.164661 7061 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 04:54:46.164717 7061 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:54:46.164760 7061 factory.go:656] Stopping watch factory\\\\nI0321 04:54:46.164806 7061 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:54:46.164816 7061 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:54:46.175857 7061 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0321 04:54:46.175901 7061 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0321 04:54:46.175974 7061 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:54:46.176003 7061 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:54:46.176112 7061 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.762741 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.775638 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.788621 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.800485 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc 
kubenswrapper[4580]: I0321 04:54:55.814200 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.831437 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab36
0665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.842294 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.857475 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.870256 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a470c4bd-de9d-4055-a2d6-3bae9a7187b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40eb04b778cb587d9b7938ac13c10070df9e92b5370ca3003492e52c5716022e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d453904d9212dd1ff8de7a05c8a5922e5cf807595bbcb3742a2488190c557d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbfc04456bbc9d1cfbebfdcc34be31c680afe3a7b210a663cbb214c4f929624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66879439fc53813893d44ee5736e7248e2cf88eed3dfa7198829dd536598f8af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66879439fc53813893d44ee5736e7248e2cf88eed3dfa7198829dd536598f8af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.884853 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:55 crc kubenswrapper[4580]: I0321 04:54:55.901037 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:33Z\\\",\\\"message\\\":\\\"2026-03-21T04:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654\\\\n2026-03-21T04:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654 to /host/opt/cni/bin/\\\\n2026-03-21T04:53:48Z [verbose] multus-daemon started\\\\n2026-03-21T04:53:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:54:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.555854 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.555951 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.555990 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.556016 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.556030 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:56Z","lastTransitionTime":"2026-03-21T04:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:56 crc kubenswrapper[4580]: E0321 04:54:56.570246 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.575208 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.575271 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.575285 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.575305 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.575319 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:56Z","lastTransitionTime":"2026-03-21T04:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:56 crc kubenswrapper[4580]: E0321 04:54:56.594197 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.598423 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.598478 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.598488 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.598507 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.598519 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:56Z","lastTransitionTime":"2026-03-21T04:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:56 crc kubenswrapper[4580]: E0321 04:54:56.610883 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.614858 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.614888 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.614897 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.614918 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.614930 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:56Z","lastTransitionTime":"2026-03-21T04:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:56 crc kubenswrapper[4580]: E0321 04:54:56.626942 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.631869 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.631908 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.631917 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.631935 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:54:56 crc kubenswrapper[4580]: I0321 04:54:56.631947 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:54:56Z","lastTransitionTime":"2026-03-21T04:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:54:56 crc kubenswrapper[4580]: E0321 04:54:56.645105 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:54:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:54:56 crc kubenswrapper[4580]: E0321 04:54:56.645292 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:54:57 crc kubenswrapper[4580]: I0321 04:54:57.617381 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:57 crc kubenswrapper[4580]: I0321 04:54:57.617471 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:57 crc kubenswrapper[4580]: E0321 04:54:57.617587 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:54:57 crc kubenswrapper[4580]: E0321 04:54:57.617772 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:57 crc kubenswrapper[4580]: I0321 04:54:57.617928 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:57 crc kubenswrapper[4580]: E0321 04:54:57.618003 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:57 crc kubenswrapper[4580]: I0321 04:54:57.618137 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:57 crc kubenswrapper[4580]: E0321 04:54:57.618360 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:59 crc kubenswrapper[4580]: I0321 04:54:59.617296 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:54:59 crc kubenswrapper[4580]: I0321 04:54:59.617336 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:54:59 crc kubenswrapper[4580]: E0321 04:54:59.617537 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:54:59 crc kubenswrapper[4580]: I0321 04:54:59.617336 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:54:59 crc kubenswrapper[4580]: I0321 04:54:59.617339 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:54:59 crc kubenswrapper[4580]: E0321 04:54:59.617700 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:54:59 crc kubenswrapper[4580]: E0321 04:54:59.617818 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:54:59 crc kubenswrapper[4580]: E0321 04:54:59.617916 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:00 crc kubenswrapper[4580]: E0321 04:55:00.723884 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:55:01 crc kubenswrapper[4580]: I0321 04:55:01.617165 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:01 crc kubenswrapper[4580]: I0321 04:55:01.617171 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:01 crc kubenswrapper[4580]: I0321 04:55:01.617208 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:01 crc kubenswrapper[4580]: E0321 04:55:01.617546 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:01 crc kubenswrapper[4580]: E0321 04:55:01.617347 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:01 crc kubenswrapper[4580]: E0321 04:55:01.617622 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:01 crc kubenswrapper[4580]: I0321 04:55:01.617221 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:01 crc kubenswrapper[4580]: E0321 04:55:01.617735 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:02 crc kubenswrapper[4580]: I0321 04:55:02.618492 4580 scope.go:117] "RemoveContainer" containerID="b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d" Mar 21 04:55:02 crc kubenswrapper[4580]: E0321 04:55:02.618669 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" Mar 21 04:55:03 crc kubenswrapper[4580]: I0321 04:55:03.617435 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:03 crc kubenswrapper[4580]: I0321 04:55:03.617510 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:03 crc kubenswrapper[4580]: E0321 04:55:03.617572 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:03 crc kubenswrapper[4580]: E0321 04:55:03.617656 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:03 crc kubenswrapper[4580]: I0321 04:55:03.618023 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:03 crc kubenswrapper[4580]: I0321 04:55:03.618023 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:03 crc kubenswrapper[4580]: E0321 04:55:03.618078 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:03 crc kubenswrapper[4580]: E0321 04:55:03.619017 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:04 crc kubenswrapper[4580]: I0321 04:55:04.815296 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:55:04 crc kubenswrapper[4580]: I0321 04:55:04.833480 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:55:04Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:04 crc kubenswrapper[4580]: I0321 04:55:04.854379 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:04Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:04 crc kubenswrapper[4580]: I0321 04:55:04.888882 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"298538c0-bbe0-440c-a0a0-d166812f1ba6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25e55cf6b1c2feb181c0c5139f51f295f900960f5d429a616ebe01bed366c67c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c903f61cf0e6b078fa36787397aa50d7244f2f58db68e6eaab165f0b44dacd7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d40733302e6914f45c270751c177d3b83a20ff24a950b73a289fcf7abadd3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c8717bf5a92472b2134747e609ccf93c68e68a2021d51be80e4ca5aae8042e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a719c23dcc7675e21e14902c6fe2ef4c18bc715a610599abbe9f2b775da9c25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:04Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:04 crc kubenswrapper[4580]: I0321 04:55:04.903550 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e5c1013-1954-43fa-9340-02f5bc2176b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-21T04:55:04Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:04 crc kubenswrapper[4580]: I0321 04:55:04.924860 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name
\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:04Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:04 crc kubenswrapper[4580]: I0321 04:55:04.939211 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:04Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:04 crc kubenswrapper[4580]: I0321 04:55:04.955019 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:04Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:04 crc kubenswrapper[4580]: I0321 04:55:04.968298 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:04Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:04 crc kubenswrapper[4580]: I0321 04:55:04.990184 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"flector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.154561 7061 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.154599 7061 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.155026 7061 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.164661 7061 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 04:54:46.164717 7061 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:54:46.164760 7061 factory.go:656] Stopping watch factory\\\\nI0321 04:54:46.164806 7061 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:54:46.164816 7061 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:54:46.175857 7061 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0321 04:54:46.175901 7061 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0321 04:54:46.175974 7061 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:54:46.176003 7061 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:54:46.176112 7061 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:04Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.004657 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c9946250
9af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.018654 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.034269 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.044071 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc 
kubenswrapper[4580]: I0321 04:55:05.056253 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:33Z\\\",\\\"message\\\":\\\"2026-03-21T04:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654\\\\n2026-03-21T04:53:47+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654 to /host/opt/cni/bin/\\\\n2026-03-21T04:53:48Z [verbose] multus-daemon started\\\\n2026-03-21T04:53:48Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:54:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.077064 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab36
0665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.088259 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.101847 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":5,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be
1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.114373 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a470c4bd-de9d-4055-a2d6-3bae9a7187b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40eb04b778cb587d9b7938ac13c10070df9e92b5370ca3003492e52c5716022e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d453904d9212dd1ff8de7a05c8a5922e5cf807595bbcb3742a2488190c557d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbfc04456bbc9d1cfbebfdcc34be31c680afe3a7b210a663cbb214c4f929624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66879439fc53813893d44ee5736e7248e2cf88eed3dfa7198829dd536598f8af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://66879439fc53813893d44ee5736e7248e2cf88eed3dfa7198829dd536598f8af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.128363 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.617612 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.617699 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:05 crc kubenswrapper[4580]: E0321 04:55:05.617730 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.617759 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.617825 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:05 crc kubenswrapper[4580]: E0321 04:55:05.617935 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:05 crc kubenswrapper[4580]: E0321 04:55:05.618076 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:05 crc kubenswrapper[4580]: E0321 04:55:05.618151 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.635221 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.652772 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.676496 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b33648e-09ea-47e5-a32d-8bc5f0209e92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:46Z\\\",\\\"message\\\":\\\"flector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.154561 7061 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.154599 7061 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.155026 7061 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:54:46.164661 7061 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 04:54:46.164717 7061 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:54:46.164760 7061 factory.go:656] Stopping watch factory\\\\nI0321 04:54:46.164806 7061 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:54:46.164816 7061 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:54:46.175857 7061 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0321 04:54:46.175901 7061 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0321 04:54:46.175974 7061 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:54:46.176003 7061 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:54:46.176112 7061 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:54:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2338cba87dd5898b9d
add05acb49a9f239b9330bd6ad71fd54ae38b81e830623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-shb8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2pzl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.689219 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j7s9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab49cffe-6918-451d-bbf0-8933c7395982\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60c95d01e779512b2e9e442adf15c285596543235314e95b9fb09772256eb201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j7s9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.706877 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1c2d4f3-6f26-4ac5-a7f7-e748fbfb0d41\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90717d29546ff405d0a575aa3d8e07198dff5aa42a94037148ef25657c80ab14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://4c249fec74ba01ef95551012a54712c2fbb1249e2ec94743210259bb8eee3c68\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:52:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:51:39.379825 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:51:39.384803 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:51:39.443227 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:51:39.449493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:52:03.350260 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:52:03.350456 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:52:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://096070e8e683ea8bac4a6e66ffbb2593bf37ee812229751fb9bef2ff1bc18f32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ebc2981e9de8a1706d4c39a3e53fd3822c904e9d820f3e5b82f23d3d012d429\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.721155 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41dca32dcd316141066ad136f7ab1e390bd4577190c828090f6a7ae73bf55c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: E0321 04:55:05.724865 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.733858 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sjrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fpb6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc 
kubenswrapper[4580]: I0321 04:55:05.749864 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39312d7d-2530-4274-a347-e32998996270\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921c06d6e877d9e93e2c3712bdb676190492f40db17a209d4ef87cf23b85aecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44b4dcc2658d04c738fc83c2a6121c99462509af5bf3d70f8771f25b77fbd80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmqbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mh4jz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.764164 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk68q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0217ec0-8db1-4e76-bda0-e6299469b59c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7749f88810dc68f03a40d10b1e9cca244ba3866f073d7c1cd86e0ddee50b70aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b417af937b3edd712598f4013d81d1d2ec3247033e0e373892dbaec629e33775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98cd0bde0d2dab02f0824f97f50c1f823dfa419f0baa16c24da05344a89424fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e34fa34d91a488f5085ef208688ab4fb05c45b676894ebaa9e8ee6e376c50c9d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab36
0665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab360665258da86f7349488a70e7d9bd4262772c2ff3a45366ff4a564f7444b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f101abe848c1fdef7a257a3707ce949c6f4ea38867a9a69860aa988e035109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa4e21de97633c0fb7bedcbb3b547f556db5205f9ab8480e13fc46e1c1c1ac4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:53:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pc78n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk68q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.776207 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhc9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c44c4ee-33a3-482b-b409-8ad89483790d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4aa985cad4aebf1213ec88d4042867b77fffbb54a0485e3d9e5b7b412ff650d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xphxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhc9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.790274 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"390e4700-584c-4822-a638-08a1e97f37e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:53:26Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0321 04:53:26.200716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:53:26.200876 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:53:26.201556 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1262746501/tls.crt::/tmp/serving-cert-1262746501/tls.key\\\\\\\"\\\\nI0321 04:53:26.437606 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:53:26.439463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:53:26.439488 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:53:26.439513 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:53:26.439519 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:53:26.449634 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 04:53:26.449662 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:53:26.449676 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449684 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:53:26.449689 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 04:53:26.449693 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:53:26.449697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:53:26.449699 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:53:26.452993 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":5,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://545c2babec81ea5b76ea83e0b44a0dd31be
1ea14888b0c8ed767c6a71608b828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.805801 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a470c4bd-de9d-4055-a2d6-3bae9a7187b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:52:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40eb04b778cb587d9b7938ac13c10070df9e92b5370ca3003492e52c5716022e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d453904d9212dd1ff8de7a05c8a5922e5cf807595bbcb3742a2488190c557d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbfc04456bbc9d1cfbebfdcc34be31c680afe3a7b210a663cbb214c4f929624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66879439fc53813893d44ee5736e7248e2cf88eed3dfa7198829dd536598f8af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://66879439fc53813893d44ee5736e7248e2cf88eed3dfa7198829dd536598f8af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.821554 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaab59f2a051e3f7df436198b41f389847ab75e0883b190b46a62a37b6281060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d9f4cb20516a1fa9bf40d34a0b18d129cc20ce9f15f0f46a9a6c05965d5429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.838728 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5bcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6761e28-8a0c-4ea2-b248-2bd60e3862e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:54:33Z\\\",\\\"message\\\":\\\"2026-03-21T04:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654\\\\n2026-03-21T04:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d5ce33-9a8e-4cb1-b5f6-4ee82b181654 to /host/opt/cni/bin/\\\\n2026-03-21T04:53:48Z [verbose] multus-daemon started\\\\n2026-03-21T04:53:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:54:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7s5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5bcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.855264 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.876872 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"298538c0-bbe0-440c-a0a0-d166812f1ba6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25e55cf6b1c2feb181c0c5139f51f295f900960f5d429a616ebe01bed366c67c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c903f61cf0e6b078fa36787397aa50d7244f2f58db68e6eaab165f0b44dacd7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d40733302e6914f45c270751c177d3b83a20ff24a950b73a289fcf7abadd3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c8717bf5a92472b2134747e609ccf93c68e68a2021d51be80e4ca5aae8042e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a719c23dcc7675e21e14902c6fe2ef4c18bc715a610599abbe9f2b775da9c25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a8270e80bc5434e9f641c9521afb753eab94508edd9fe219b177cb91fc1c6f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7468df1b5aa359767ba5fdd7ae056d8df067d4a5f6deb76ca3b0e7b3df039a46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6ece02d1cb59d453511a6c387a60df5c1cb0f03c66b093514ca6d7ae2ad7a5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.889396 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e5c1013-1954-43fa-9340-02f5bc2176b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e34cce0b8b2645887d3fb044b916f5c1b348ed6ac52ff3d47b8c926223993d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ab2151ebb569046157291b66d79512d644e803eaf833c3a27328fdc0da4a41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:51:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:51:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.902026 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9668dcb-27e6-469d-aa01-da4dc9cf6664\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510c9dbc3733bd8b35f21a152939beabb43c16cac3ba4ec851cf0efc4b76794f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name
\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:53:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7w8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:05 crc kubenswrapper[4580]: I0321 04:55:05.915615 4580 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:53:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559f41d7b0b17f12d91837128458a9c5290177fb4aae2a9ff646532f86d9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:55:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.717472 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.717952 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.717970 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.717989 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.718006 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:55:06Z","lastTransitionTime":"2026-03-21T04:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:55:06 crc kubenswrapper[4580]: E0321 04:55:06.733916 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.737992 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.738065 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.738086 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.738116 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.738140 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:55:06Z","lastTransitionTime":"2026-03-21T04:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.763290 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.763328 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.763374 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.763389 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.763399 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:55:06Z","lastTransitionTime":"2026-03-21T04:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:55:06 crc kubenswrapper[4580]: E0321 04:55:06.779153 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.783140 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.783181 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.783193 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.783209 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.783218 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:55:06Z","lastTransitionTime":"2026-03-21T04:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:55:06 crc kubenswrapper[4580]: E0321 04:55:06.796704 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.800691 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.800730 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.800747 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.800767 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:55:06 crc kubenswrapper[4580]: I0321 04:55:06.800803 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:55:06Z","lastTransitionTime":"2026-03-21T04:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:55:06 crc kubenswrapper[4580]: E0321 04:55:06.814955 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b0468f0-788e-4966-a835-4b5e60e90122\\\",\\\"systemUUID\\\":\\\"30104a4f-3cbf-4278-a817-16cb78d9b6b0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:55:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:55:06 crc kubenswrapper[4580]: E0321 04:55:06.815135 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:55:07 crc kubenswrapper[4580]: I0321 04:55:07.617441 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:07 crc kubenswrapper[4580]: I0321 04:55:07.617520 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:07 crc kubenswrapper[4580]: I0321 04:55:07.617457 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:07 crc kubenswrapper[4580]: E0321 04:55:07.617635 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:07 crc kubenswrapper[4580]: I0321 04:55:07.617655 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:07 crc kubenswrapper[4580]: E0321 04:55:07.617804 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:07 crc kubenswrapper[4580]: E0321 04:55:07.617926 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:07 crc kubenswrapper[4580]: E0321 04:55:07.618049 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:09 crc kubenswrapper[4580]: I0321 04:55:09.616752 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:09 crc kubenswrapper[4580]: I0321 04:55:09.616814 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:09 crc kubenswrapper[4580]: E0321 04:55:09.616922 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:09 crc kubenswrapper[4580]: I0321 04:55:09.616969 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:09 crc kubenswrapper[4580]: I0321 04:55:09.616978 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:09 crc kubenswrapper[4580]: E0321 04:55:09.617047 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:09 crc kubenswrapper[4580]: E0321 04:55:09.617166 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:09 crc kubenswrapper[4580]: E0321 04:55:09.617401 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:10 crc kubenswrapper[4580]: E0321 04:55:10.726594 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:55:11 crc kubenswrapper[4580]: I0321 04:55:11.617075 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:11 crc kubenswrapper[4580]: E0321 04:55:11.617265 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:11 crc kubenswrapper[4580]: I0321 04:55:11.617486 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:11 crc kubenswrapper[4580]: E0321 04:55:11.617540 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:11 crc kubenswrapper[4580]: I0321 04:55:11.617650 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:11 crc kubenswrapper[4580]: E0321 04:55:11.617696 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:11 crc kubenswrapper[4580]: I0321 04:55:11.617957 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:11 crc kubenswrapper[4580]: E0321 04:55:11.618028 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:13 crc kubenswrapper[4580]: I0321 04:55:13.617280 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:13 crc kubenswrapper[4580]: E0321 04:55:13.617440 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:13 crc kubenswrapper[4580]: I0321 04:55:13.617296 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:13 crc kubenswrapper[4580]: I0321 04:55:13.617277 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:13 crc kubenswrapper[4580]: I0321 04:55:13.617562 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:13 crc kubenswrapper[4580]: E0321 04:55:13.617530 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:13 crc kubenswrapper[4580]: E0321 04:55:13.617680 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:13 crc kubenswrapper[4580]: E0321 04:55:13.617814 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.616891 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.616948 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.617003 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:15 crc kubenswrapper[4580]: E0321 04:55:15.617085 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.617114 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:15 crc kubenswrapper[4580]: E0321 04:55:15.617459 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:15 crc kubenswrapper[4580]: E0321 04:55:15.617513 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:15 crc kubenswrapper[4580]: E0321 04:55:15.617581 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.617859 4580 scope.go:117] "RemoveContainer" containerID="b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d" Mar 21 04:55:15 crc kubenswrapper[4580]: E0321 04:55:15.618043 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.703594 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-j7s9f" podStartSLOduration=160.703576865 podStartE2EDuration="2m40.703576865s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:15.703443972 +0000 UTC m=+220.786027610" watchObservedRunningTime="2026-03-21 04:55:15.703576865 +0000 UTC m=+220.786160493" Mar 21 04:55:15 crc kubenswrapper[4580]: E0321 04:55:15.728187 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.731345 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=89.731323648 podStartE2EDuration="1m29.731323648s" podCreationTimestamp="2026-03-21 04:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:15.717711268 +0000 UTC m=+220.800294906" watchObservedRunningTime="2026-03-21 04:55:15.731323648 +0000 UTC m=+220.813907286" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.765368 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mh4jz" podStartSLOduration=159.765350247 podStartE2EDuration="2m39.765350247s" podCreationTimestamp="2026-03-21 04:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:15.761664745 +0000 UTC m=+220.844248383" watchObservedRunningTime="2026-03-21 04:55:15.765350247 +0000 UTC m=+220.847933875" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.782512 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=87.782491984 podStartE2EDuration="1m27.782491984s" podCreationTimestamp="2026-03-21 04:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:15.782148066 +0000 UTC m=+220.864731724" watchObservedRunningTime="2026-03-21 04:55:15.782491984 +0000 UTC m=+220.865075612" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.798221 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=22.798202816 
podStartE2EDuration="22.798202816s" podCreationTimestamp="2026-03-21 04:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:15.798125884 +0000 UTC m=+220.880709532" watchObservedRunningTime="2026-03-21 04:55:15.798202816 +0000 UTC m=+220.880786444" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.828095 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-z5bcs" podStartSLOduration=160.828077532 podStartE2EDuration="2m40.828077532s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:15.82761046 +0000 UTC m=+220.910194128" watchObservedRunningTime="2026-03-21 04:55:15.828077532 +0000 UTC m=+220.910661160" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.849009 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gk68q" podStartSLOduration=160.848990254 podStartE2EDuration="2m40.848990254s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:15.84845221 +0000 UTC m=+220.931035858" watchObservedRunningTime="2026-03-21 04:55:15.848990254 +0000 UTC m=+220.931573882" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.862282 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qhc9t" podStartSLOduration=160.862258855 podStartE2EDuration="2m40.862258855s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:15.861401733 +0000 UTC m=+220.943985381" 
watchObservedRunningTime="2026-03-21 04:55:15.862258855 +0000 UTC m=+220.944842493" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.901634 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=47.901609156 podStartE2EDuration="47.901609156s" podCreationTimestamp="2026-03-21 04:54:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:15.88932452 +0000 UTC m=+220.971908168" watchObservedRunningTime="2026-03-21 04:55:15.901609156 +0000 UTC m=+220.984192804" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.902055 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=62.902048397 podStartE2EDuration="1m2.902048397s" podCreationTimestamp="2026-03-21 04:54:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:15.901867863 +0000 UTC m=+220.984451511" watchObservedRunningTime="2026-03-21 04:55:15.902048397 +0000 UTC m=+220.984632025" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.933271 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podStartSLOduration=160.933253626 podStartE2EDuration="2m40.933253626s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:15.917663127 +0000 UTC m=+221.000246775" watchObservedRunningTime="2026-03-21 04:55:15.933253626 +0000 UTC m=+221.015837264" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.972185 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:55:15 crc kubenswrapper[4580]: I0321 04:55:15.973059 4580 scope.go:117] "RemoveContainer" containerID="b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d" Mar 21 04:55:15 crc kubenswrapper[4580]: E0321 04:55:15.973232 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2pzl9_openshift-ovn-kubernetes(2b33648e-09ea-47e5-a32d-8bc5f0209e92)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.135525 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.135574 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.135589 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.135609 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.135624 4580 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:55:17Z","lastTransitionTime":"2026-03-21T04:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.182108 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n"] Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.182544 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.184099 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.185519 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.185766 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.185825 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.257375 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/511ba382-ac14-43b9-8302-a2e9feafa332-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: \"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.257428 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/511ba382-ac14-43b9-8302-a2e9feafa332-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: 
\"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.257476 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/511ba382-ac14-43b9-8302-a2e9feafa332-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: \"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.257530 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/511ba382-ac14-43b9-8302-a2e9feafa332-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: \"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.257590 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/511ba382-ac14-43b9-8302-a2e9feafa332-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: \"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.358961 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/511ba382-ac14-43b9-8302-a2e9feafa332-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: \"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.359023 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/511ba382-ac14-43b9-8302-a2e9feafa332-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: \"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.359051 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/511ba382-ac14-43b9-8302-a2e9feafa332-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: \"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.359096 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/511ba382-ac14-43b9-8302-a2e9feafa332-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: \"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.359159 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/511ba382-ac14-43b9-8302-a2e9feafa332-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: \"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.359161 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/511ba382-ac14-43b9-8302-a2e9feafa332-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: 
\"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.359205 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/511ba382-ac14-43b9-8302-a2e9feafa332-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: \"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.360448 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/511ba382-ac14-43b9-8302-a2e9feafa332-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: \"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.366596 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/511ba382-ac14-43b9-8302-a2e9feafa332-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: \"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.378273 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/511ba382-ac14-43b9-8302-a2e9feafa332-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pwv6n\" (UID: \"511ba382-ac14-43b9-8302-a2e9feafa332\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.495327 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.619529 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:17 crc kubenswrapper[4580]: E0321 04:55:17.619643 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.619834 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:17 crc kubenswrapper[4580]: E0321 04:55:17.619889 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.620007 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:17 crc kubenswrapper[4580]: E0321 04:55:17.620058 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.620240 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:17 crc kubenswrapper[4580]: E0321 04:55:17.620290 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.752576 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" event={"ID":"511ba382-ac14-43b9-8302-a2e9feafa332","Type":"ContainerStarted","Data":"7a5923a7c68faf95daf0e13f2687968858754543f994cac4aec970a8dcffded8"} Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.752632 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" event={"ID":"511ba382-ac14-43b9-8302-a2e9feafa332","Type":"ContainerStarted","Data":"f3e898f944b4b2908dee9cd0c6b6889469bad6fe3f6c9209642f5849d1d62434"} Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.841813 4580 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 21 04:55:17 crc kubenswrapper[4580]: I0321 04:55:17.853600 4580 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 21 04:55:19 crc kubenswrapper[4580]: I0321 04:55:19.617713 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:19 crc kubenswrapper[4580]: I0321 04:55:19.617713 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:19 crc kubenswrapper[4580]: I0321 04:55:19.617728 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:19 crc kubenswrapper[4580]: I0321 04:55:19.617827 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:19 crc kubenswrapper[4580]: E0321 04:55:19.618127 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:19 crc kubenswrapper[4580]: E0321 04:55:19.618199 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:19 crc kubenswrapper[4580]: E0321 04:55:19.618044 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:19 crc kubenswrapper[4580]: E0321 04:55:19.618449 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:19 crc kubenswrapper[4580]: I0321 04:55:19.761700 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z5bcs_f6761e28-8a0c-4ea2-b248-2bd60e3862e6/kube-multus/1.log" Mar 21 04:55:19 crc kubenswrapper[4580]: I0321 04:55:19.762127 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z5bcs_f6761e28-8a0c-4ea2-b248-2bd60e3862e6/kube-multus/0.log" Mar 21 04:55:19 crc kubenswrapper[4580]: I0321 04:55:19.762197 4580 generic.go:334] "Generic (PLEG): container finished" podID="f6761e28-8a0c-4ea2-b248-2bd60e3862e6" containerID="54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8" exitCode=1 Mar 21 04:55:19 crc kubenswrapper[4580]: I0321 04:55:19.762239 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5bcs" event={"ID":"f6761e28-8a0c-4ea2-b248-2bd60e3862e6","Type":"ContainerDied","Data":"54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8"} Mar 21 04:55:19 crc kubenswrapper[4580]: I0321 04:55:19.762293 4580 scope.go:117] "RemoveContainer" containerID="2e0ad17a8b3dc20e556a8b8388632ca04c55dcc84b63bec04de1d0d7426d1c57" Mar 21 04:55:19 crc kubenswrapper[4580]: I0321 04:55:19.763274 4580 scope.go:117] "RemoveContainer" containerID="54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8" Mar 21 04:55:19 crc kubenswrapper[4580]: 
E0321 04:55:19.763561 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-z5bcs_openshift-multus(f6761e28-8a0c-4ea2-b248-2bd60e3862e6)\"" pod="openshift-multus/multus-z5bcs" podUID="f6761e28-8a0c-4ea2-b248-2bd60e3862e6" Mar 21 04:55:19 crc kubenswrapper[4580]: I0321 04:55:19.797019 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pwv6n" podStartSLOduration=164.79698611 podStartE2EDuration="2m44.79698611s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:17.769496682 +0000 UTC m=+222.852080320" watchObservedRunningTime="2026-03-21 04:55:19.79698611 +0000 UTC m=+224.879569748" Mar 21 04:55:20 crc kubenswrapper[4580]: E0321 04:55:20.730237 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:55:20 crc kubenswrapper[4580]: I0321 04:55:20.768273 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z5bcs_f6761e28-8a0c-4ea2-b248-2bd60e3862e6/kube-multus/1.log" Mar 21 04:55:21 crc kubenswrapper[4580]: I0321 04:55:21.617202 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:21 crc kubenswrapper[4580]: I0321 04:55:21.617218 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:21 crc kubenswrapper[4580]: E0321 04:55:21.617739 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:21 crc kubenswrapper[4580]: I0321 04:55:21.617364 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:21 crc kubenswrapper[4580]: E0321 04:55:21.617829 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:21 crc kubenswrapper[4580]: I0321 04:55:21.617254 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:21 crc kubenswrapper[4580]: E0321 04:55:21.617882 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:21 crc kubenswrapper[4580]: E0321 04:55:21.617947 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:23 crc kubenswrapper[4580]: I0321 04:55:23.617026 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:23 crc kubenswrapper[4580]: I0321 04:55:23.617084 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:23 crc kubenswrapper[4580]: I0321 04:55:23.617196 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:23 crc kubenswrapper[4580]: E0321 04:55:23.617964 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:23 crc kubenswrapper[4580]: E0321 04:55:23.618357 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:23 crc kubenswrapper[4580]: E0321 04:55:23.618484 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:23 crc kubenswrapper[4580]: I0321 04:55:23.618623 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:23 crc kubenswrapper[4580]: E0321 04:55:23.618853 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:25 crc kubenswrapper[4580]: I0321 04:55:25.616917 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:25 crc kubenswrapper[4580]: I0321 04:55:25.616917 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:25 crc kubenswrapper[4580]: I0321 04:55:25.616922 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:25 crc kubenswrapper[4580]: I0321 04:55:25.618490 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:25 crc kubenswrapper[4580]: E0321 04:55:25.618483 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:25 crc kubenswrapper[4580]: E0321 04:55:25.618629 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:25 crc kubenswrapper[4580]: E0321 04:55:25.618824 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:25 crc kubenswrapper[4580]: E0321 04:55:25.618876 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:25 crc kubenswrapper[4580]: E0321 04:55:25.732295 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:55:27 crc kubenswrapper[4580]: I0321 04:55:27.618154 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:27 crc kubenswrapper[4580]: I0321 04:55:27.618247 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:27 crc kubenswrapper[4580]: I0321 04:55:27.618168 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:27 crc kubenswrapper[4580]: E0321 04:55:27.618382 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:27 crc kubenswrapper[4580]: I0321 04:55:27.618255 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:27 crc kubenswrapper[4580]: E0321 04:55:27.618518 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:27 crc kubenswrapper[4580]: E0321 04:55:27.618630 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:27 crc kubenswrapper[4580]: E0321 04:55:27.618703 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:29 crc kubenswrapper[4580]: I0321 04:55:29.617501 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:29 crc kubenswrapper[4580]: I0321 04:55:29.617531 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:29 crc kubenswrapper[4580]: I0321 04:55:29.617820 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:29 crc kubenswrapper[4580]: I0321 04:55:29.617829 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:29 crc kubenswrapper[4580]: E0321 04:55:29.617950 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:29 crc kubenswrapper[4580]: E0321 04:55:29.618297 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:29 crc kubenswrapper[4580]: I0321 04:55:29.618348 4580 scope.go:117] "RemoveContainer" containerID="b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d" Mar 21 04:55:29 crc kubenswrapper[4580]: E0321 04:55:29.618421 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:29 crc kubenswrapper[4580]: E0321 04:55:29.618554 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:29 crc kubenswrapper[4580]: I0321 04:55:29.808947 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/3.log" Mar 21 04:55:29 crc kubenswrapper[4580]: I0321 04:55:29.811891 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerStarted","Data":"f3a7b7fc8d7b086453bdbebc04f338be16fb353fd37bc5d175c1797db5d57c46"} Mar 21 04:55:29 crc kubenswrapper[4580]: I0321 04:55:29.812887 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:55:30 crc kubenswrapper[4580]: I0321 04:55:30.475676 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podStartSLOduration=175.475647317 podStartE2EDuration="2m55.475647317s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:29.854139639 +0000 UTC m=+234.936723297" watchObservedRunningTime="2026-03-21 04:55:30.475647317 +0000 UTC m=+235.558230945" Mar 21 04:55:30 crc kubenswrapper[4580]: I0321 04:55:30.476041 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fpb6h"] Mar 21 04:55:30 crc kubenswrapper[4580]: I0321 04:55:30.476182 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:30 crc kubenswrapper[4580]: E0321 04:55:30.476315 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:30 crc kubenswrapper[4580]: I0321 04:55:30.619402 4580 scope.go:117] "RemoveContainer" containerID="54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8" Mar 21 04:55:30 crc kubenswrapper[4580]: E0321 04:55:30.734152 4580 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:55:30 crc kubenswrapper[4580]: I0321 04:55:30.817790 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z5bcs_f6761e28-8a0c-4ea2-b248-2bd60e3862e6/kube-multus/1.log" Mar 21 04:55:30 crc kubenswrapper[4580]: I0321 04:55:30.817910 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5bcs" event={"ID":"f6761e28-8a0c-4ea2-b248-2bd60e3862e6","Type":"ContainerStarted","Data":"b9bd2bb0ffd184225a6d57bedbcd4c082b6bce0cbbac8c80394eab05be82361a"} Mar 21 04:55:31 crc kubenswrapper[4580]: I0321 04:55:31.617466 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:31 crc kubenswrapper[4580]: E0321 04:55:31.617653 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:31 crc kubenswrapper[4580]: I0321 04:55:31.617801 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:31 crc kubenswrapper[4580]: E0321 04:55:31.618036 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:31 crc kubenswrapper[4580]: I0321 04:55:31.617467 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:31 crc kubenswrapper[4580]: E0321 04:55:31.618153 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:31 crc kubenswrapper[4580]: I0321 04:55:31.617847 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:31 crc kubenswrapper[4580]: E0321 04:55:31.618249 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:33 crc kubenswrapper[4580]: I0321 04:55:33.617443 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:33 crc kubenswrapper[4580]: E0321 04:55:33.617629 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:33 crc kubenswrapper[4580]: I0321 04:55:33.617952 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:33 crc kubenswrapper[4580]: E0321 04:55:33.618015 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:33 crc kubenswrapper[4580]: I0321 04:55:33.618165 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:33 crc kubenswrapper[4580]: E0321 04:55:33.618228 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:33 crc kubenswrapper[4580]: I0321 04:55:33.618567 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:33 crc kubenswrapper[4580]: E0321 04:55:33.618649 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:35 crc kubenswrapper[4580]: I0321 04:55:35.617567 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:35 crc kubenswrapper[4580]: I0321 04:55:35.617615 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:35 crc kubenswrapper[4580]: I0321 04:55:35.617667 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:35 crc kubenswrapper[4580]: I0321 04:55:35.617793 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:35 crc kubenswrapper[4580]: E0321 04:55:35.619166 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpb6h" podUID="ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7" Mar 21 04:55:35 crc kubenswrapper[4580]: E0321 04:55:35.619292 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:55:35 crc kubenswrapper[4580]: E0321 04:55:35.619341 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:55:35 crc kubenswrapper[4580]: E0321 04:55:35.619393 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.617229 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.617229 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.617308 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.617313 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.621240 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.622052 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.622216 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.622309 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.622392 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.622490 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.779124 4580 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.820556 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xh9jk"] Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.821192 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.821352 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58"] Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.822017 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.827300 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fshpg"] Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.828046 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.829852 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cvqg4"] Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.830543 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp"] Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.830594 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.831056 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.832984 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w"] Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.833568 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.845085 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.845217 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.846845 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.847727 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.847844 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.848557 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.848876 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qpsf8"] Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.849332 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.849508 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qpsf8" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.849505 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.849891 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.849933 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.850225 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.850305 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.850552 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.850881 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.851018 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.851565 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.851585 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr"] Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.851705 
4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.851871 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.851994 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.852105 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.852912 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.853040 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.853233 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.853354 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.853346 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.856533 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-48dqz"] Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.857141 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.857255 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.857269 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.857545 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.857565 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.857765 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.857822 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.857883 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.857894 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.857822 
4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.858013 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.858044 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.858135 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.858238 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.858938 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.859074 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.859228 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.864981 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.876613 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9"] Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.881343 4580 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.897828 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.898365 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv"] Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.898829 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.898964 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.899371 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.899468 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.899494 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.899640 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-4ffj8"] Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.899501 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.900429 4580 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4ffj8" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.901101 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kmfk5"] Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.901663 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.908162 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.908545 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.908590 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.910767 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da00c0bc-2ff1-4b15-be1f-8fac48921976-config\") pod \"machine-api-operator-5694c8668f-cvqg4\" (UID: \"da00c0bc-2ff1-4b15-be1f-8fac48921976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.910816 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5aabdab2-8553-4784-ac9e-8d1a42b5d32b-trusted-ca\") pod \"console-operator-58897d9998-qpsf8\" (UID: \"5aabdab2-8553-4784-ac9e-8d1a42b5d32b\") " pod="openshift-console-operator/console-operator-58897d9998-qpsf8" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.910839 4580 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/edb0a4af-3bf8-4517-af4a-a7546e9acf87-etcd-client\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.910857 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-serving-cert\") pod \"route-controller-manager-6576b87f9c-fkfvp\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.910877 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-audit\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.910894 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/edb0a4af-3bf8-4517-af4a-a7546e9acf87-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.910909 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-client-ca\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.910926 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aabdab2-8553-4784-ac9e-8d1a42b5d32b-serving-cert\") pod \"console-operator-58897d9998-qpsf8\" (UID: \"5aabdab2-8553-4784-ac9e-8d1a42b5d32b\") " pod="openshift-console-operator/console-operator-58897d9998-qpsf8" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.910942 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnjwx\" (UniqueName: \"kubernetes.io/projected/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-kube-api-access-rnjwx\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.910957 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/da00c0bc-2ff1-4b15-be1f-8fac48921976-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cvqg4\" (UID: \"da00c0bc-2ff1-4b15-be1f-8fac48921976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.910980 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-config\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.910995 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6c99\" 
(UniqueName: \"kubernetes.io/projected/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-kube-api-access-p6c99\") pod \"route-controller-manager-6576b87f9c-fkfvp\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911010 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqdxh\" (UniqueName: \"kubernetes.io/projected/794862b0-a985-459a-98d4-cc82612d3593-kube-api-access-kqdxh\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911027 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb0a4af-3bf8-4517-af4a-a7546e9acf87-serving-cert\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911044 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-encryption-config\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911063 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/edb0a4af-3bf8-4517-af4a-a7546e9acf87-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 
04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911082 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/edb0a4af-3bf8-4517-af4a-a7546e9acf87-audit-dir\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911098 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkts9\" (UniqueName: \"kubernetes.io/projected/da00c0bc-2ff1-4b15-be1f-8fac48921976-kube-api-access-gkts9\") pod \"machine-api-operator-5694c8668f-cvqg4\" (UID: \"da00c0bc-2ff1-4b15-be1f-8fac48921976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911116 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zdqm\" (UniqueName: \"kubernetes.io/projected/c9c2c4c6-575b-4b3a-91ba-f694e4005859-kube-api-access-5zdqm\") pod \"openshift-apiserver-operator-796bbdcf4f-jhb4w\" (UID: \"c9c2c4c6-575b-4b3a-91ba-f694e4005859\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911138 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aabdab2-8553-4784-ac9e-8d1a42b5d32b-config\") pod \"console-operator-58897d9998-qpsf8\" (UID: \"5aabdab2-8553-4784-ac9e-8d1a42b5d32b\") " pod="openshift-console-operator/console-operator-58897d9998-qpsf8" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911163 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-client-ca\") pod \"route-controller-manager-6576b87f9c-fkfvp\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911178 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911193 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/794862b0-a985-459a-98d4-cc82612d3593-serving-cert\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911209 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-image-import-ca\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911226 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9c2c4c6-575b-4b3a-91ba-f694e4005859-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jhb4w\" (UID: \"c9c2c4c6-575b-4b3a-91ba-f694e4005859\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w" Mar 21 04:55:37 
crc kubenswrapper[4580]: I0321 04:55:37.911244 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-etcd-serving-ca\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911264 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-serving-cert\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911285 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptcjw\" (UniqueName: \"kubernetes.io/projected/f66c5870-b6ec-4673-8229-259391f6dada-kube-api-access-ptcjw\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwprr\" (UID: \"f66c5870-b6ec-4673-8229-259391f6dada\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911311 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/edb0a4af-3bf8-4517-af4a-a7546e9acf87-audit-policies\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911327 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8fcs\" (UniqueName: 
\"kubernetes.io/projected/edb0a4af-3bf8-4517-af4a-a7546e9acf87-kube-api-access-n8fcs\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911346 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/da00c0bc-2ff1-4b15-be1f-8fac48921976-images\") pod \"machine-api-operator-5694c8668f-cvqg4\" (UID: \"da00c0bc-2ff1-4b15-be1f-8fac48921976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911365 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f66c5870-b6ec-4673-8229-259391f6dada-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwprr\" (UID: \"f66c5870-b6ec-4673-8229-259391f6dada\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911380 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911427 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911577 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911383 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d8jb\" (UniqueName: \"kubernetes.io/projected/5aabdab2-8553-4784-ac9e-8d1a42b5d32b-kube-api-access-8d8jb\") pod \"console-operator-58897d9998-qpsf8\" (UID: \"5aabdab2-8553-4784-ac9e-8d1a42b5d32b\") " pod="openshift-console-operator/console-operator-58897d9998-qpsf8"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911802 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-config\") pod \"route-controller-manager-6576b87f9c-fkfvp\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911874 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-etcd-client\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911920 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911941 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9c2c4c6-575b-4b3a-91ba-f694e4005859-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jhb4w\" (UID: \"c9c2c4c6-575b-4b3a-91ba-f694e4005859\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.911986 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-audit-dir\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.912014 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.912042 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/edb0a4af-3bf8-4517-af4a-a7546e9acf87-encryption-config\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.912062 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-node-pullsecrets\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.912081 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-config\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.912107 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f66c5870-b6ec-4673-8229-259391f6dada-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwprr\" (UID: \"f66c5870-b6ec-4673-8229-259391f6dada\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.912259 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.914265 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.916063 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.923964 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.924875 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.928861 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2rbc4"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.929678 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.929684 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p8tbn"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.936176 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.936436 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.936651 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.938734 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vhp42"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.939146 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.939173 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-dnnd6"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.939487 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.939758 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.940191 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.940496 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-p8tbn"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.941011 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.941411 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-dnnd6"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.941653 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.942361 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.942599 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.942897 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.943224 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.948339 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.948588 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.948808 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.956161 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.956984 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bqkqg"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.957380 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.957904 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.957947 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.957904 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.958496 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.958672 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.959854 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.960069 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.968990 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qpsf8"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.969064 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.969076 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fshpg"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.979279 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cvqg4"]
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.995380 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.996798 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.997086 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.997735 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.998270 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 21 04:55:37 crc kubenswrapper[4580]: I0321 04:55:37.999625 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xh9jk"]
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.000251 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-br8h5"]
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.000931 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.000954 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.001623 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.001769 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-br8h5"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.001876 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.002051 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.002614 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.002985 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.004156 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.004488 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.005172 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.005393 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.005633 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.006479 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.007011 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.008731 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.009712 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.010433 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.010504 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.010585 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.010521 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.010605 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.010744 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.010674 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.010859 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.011082 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.011491 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.011501 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.011641 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.014001 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.030598 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqdxh\" (UniqueName: \"kubernetes.io/projected/794862b0-a985-459a-98d4-cc82612d3593-kube-api-access-kqdxh\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.030660 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b6460aa-400a-4a67-9ac2-93bf4268e610-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.030703 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb0a4af-3bf8-4517-af4a-a7546e9acf87-serving-cert\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.030729 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-encryption-config\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.030812 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/edb0a4af-3bf8-4517-af4a-a7546e9acf87-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.030833 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/edb0a4af-3bf8-4517-af4a-a7546e9acf87-audit-dir\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.030856 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-default-certificate\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.030880 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkts9\" (UniqueName: \"kubernetes.io/projected/da00c0bc-2ff1-4b15-be1f-8fac48921976-kube-api-access-gkts9\") pod \"machine-api-operator-5694c8668f-cvqg4\" (UID: \"da00c0bc-2ff1-4b15-be1f-8fac48921976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.030899 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a0187cd-b67d-47ee-a791-08939d1b4cc5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cl7vz\" (UID: \"8a0187cd-b67d-47ee-a791-08939d1b4cc5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.030918 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a87434a7-bb73-4552-a502-c1a31119cff7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hcpgb\" (UID: \"a87434a7-bb73-4552-a502-c1a31119cff7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.030948 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zdqm\" (UniqueName: \"kubernetes.io/projected/c9c2c4c6-575b-4b3a-91ba-f694e4005859-kube-api-access-5zdqm\") pod \"openshift-apiserver-operator-796bbdcf4f-jhb4w\" (UID: \"c9c2c4c6-575b-4b3a-91ba-f694e4005859\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.030973 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-audit-dir\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031001 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031023 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-stats-auth\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031058 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-client-ca\") pod \"route-controller-manager-6576b87f9c-fkfvp\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031083 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aabdab2-8553-4784-ac9e-8d1a42b5d32b-config\") pod \"console-operator-58897d9998-qpsf8\" (UID: \"5aabdab2-8553-4784-ac9e-8d1a42b5d32b\") " pod="openshift-console-operator/console-operator-58897d9998-qpsf8"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031105 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/794862b0-a985-459a-98d4-cc82612d3593-serving-cert\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031123 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-serving-cert\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031154 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-image-import-ca\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031178 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031207 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9c2c4c6-575b-4b3a-91ba-f694e4005859-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jhb4w\" (UID: \"c9c2c4c6-575b-4b3a-91ba-f694e4005859\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031233 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca24069e-6f60-4d3b-950e-bf6a87fa1955-config\") pod \"kube-apiserver-operator-766d6c64bb-m4kxh\" (UID: \"ca24069e-6f60-4d3b-950e-bf6a87fa1955\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031257 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a87434a7-bb73-4552-a502-c1a31119cff7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hcpgb\" (UID: \"a87434a7-bb73-4552-a502-c1a31119cff7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031280 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-etcd-serving-ca\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031303 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-serving-cert\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031329 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dd075d6a-7d15-421a-b546-19c5cab789d3-machine-approver-tls\") pod \"machine-approver-56656f9798-xgchr\" (UID: \"dd075d6a-7d15-421a-b546-19c5cab789d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031366 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzqwz\" (UniqueName: \"kubernetes.io/projected/8a0187cd-b67d-47ee-a791-08939d1b4cc5-kube-api-access-xzqwz\") pod \"cluster-image-registry-operator-dc59b4c8b-cl7vz\" (UID: \"8a0187cd-b67d-47ee-a791-08939d1b4cc5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031400 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptcjw\" (UniqueName: \"kubernetes.io/projected/f66c5870-b6ec-4673-8229-259391f6dada-kube-api-access-ptcjw\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwprr\" (UID: \"f66c5870-b6ec-4673-8229-259391f6dada\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031430 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-etcd-client\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031453 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzvkb\" (UniqueName: \"kubernetes.io/projected/69b1f163-8594-47b1-85c7-3330e0d50d8f-kube-api-access-gzvkb\") pod \"downloads-7954f5f757-4ffj8\" (UID: \"69b1f163-8594-47b1-85c7-3330e0d50d8f\") " pod="openshift-console/downloads-7954f5f757-4ffj8"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031477 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nvn7\" (UniqueName: \"kubernetes.io/projected/dd075d6a-7d15-421a-b546-19c5cab789d3-kube-api-access-9nvn7\") pod \"machine-approver-56656f9798-xgchr\" (UID: \"dd075d6a-7d15-421a-b546-19c5cab789d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031508 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031530 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031552 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca24069e-6f60-4d3b-950e-bf6a87fa1955-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m4kxh\" (UID: \"ca24069e-6f60-4d3b-950e-bf6a87fa1955\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031592 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/edb0a4af-3bf8-4517-af4a-a7546e9acf87-audit-policies\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031616 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b6460aa-400a-4a67-9ac2-93bf4268e610-serving-cert\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031643 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8fcs\" (UniqueName: \"kubernetes.io/projected/edb0a4af-3bf8-4517-af4a-a7546e9acf87-kube-api-access-n8fcs\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031668 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-config\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031693 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05d9d8f9-d2d0-48ad-9583-5caaf4675cd4-serving-cert\") pod \"openshift-config-operator-7777fb866f-c8fh9\" (UID: \"05d9d8f9-d2d0-48ad-9583-5caaf4675cd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031715 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031747 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f66c5870-b6ec-4673-8229-259391f6dada-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwprr\" (UID: \"f66c5870-b6ec-4673-8229-259391f6dada\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031775 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d8jb\" (UniqueName: \"kubernetes.io/projected/5aabdab2-8553-4784-ac9e-8d1a42b5d32b-kube-api-access-8d8jb\") pod \"console-operator-58897d9998-qpsf8\" (UID: \"5aabdab2-8553-4784-ac9e-8d1a42b5d32b\") " pod="openshift-console-operator/console-operator-58897d9998-qpsf8"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031846 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/da00c0bc-2ff1-4b15-be1f-8fac48921976-images\") pod \"machine-api-operator-5694c8668f-cvqg4\" (UID: \"da00c0bc-2ff1-4b15-be1f-8fac48921976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4"
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031884 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-config\") pod \"route-controller-manager-6576b87f9c-fkfvp\" (UID:
\"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031943 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-etcd-client\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031969 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-etcd-service-ca\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.031996 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032022 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032046 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-oauth-serving-cert\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032068 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd075d6a-7d15-421a-b546-19c5cab789d3-config\") pod \"machine-approver-56656f9798-xgchr\" (UID: \"dd075d6a-7d15-421a-b546-19c5cab789d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032102 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9c2c4c6-575b-4b3a-91ba-f694e4005859-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jhb4w\" (UID: \"c9c2c4c6-575b-4b3a-91ba-f694e4005859\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032126 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-audit-policies\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032126 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/edb0a4af-3bf8-4517-af4a-a7546e9acf87-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" 
Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032152 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-console-config\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032176 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a87434a7-bb73-4552-a502-c1a31119cff7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hcpgb\" (UID: \"a87434a7-bb73-4552-a502-c1a31119cff7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032212 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-audit-dir\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032240 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032267 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/8a0187cd-b67d-47ee-a791-08939d1b4cc5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cl7vz\" (UID: \"8a0187cd-b67d-47ee-a791-08939d1b4cc5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032290 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032314 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p87tw\" (UniqueName: \"kubernetes.io/projected/7b6460aa-400a-4a67-9ac2-93bf4268e610-kube-api-access-p87tw\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032336 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-service-ca-bundle\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032365 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/edb0a4af-3bf8-4517-af4a-a7546e9acf87-encryption-config\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032386 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-node-pullsecrets\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032406 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032431 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a6d75f9-0ef3-4c99-8d68-7809b17fc607-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vckmv\" (UID: \"3a6d75f9-0ef3-4c99-8d68-7809b17fc607\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032450 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032479 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f66c5870-b6ec-4673-8229-259391f6dada-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwprr\" (UID: \"f66c5870-b6ec-4673-8229-259391f6dada\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032505 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-config\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032534 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032559 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-trusted-ca-bundle\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032604 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt4pb\" (UniqueName: \"kubernetes.io/projected/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-kube-api-access-jt4pb\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 
04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032626 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrt66\" (UniqueName: \"kubernetes.io/projected/dcd68a9c-6c86-41a1-9d04-309a7db16685-kube-api-access-mrt66\") pod \"dns-operator-744455d44c-p8tbn\" (UID: \"dcd68a9c-6c86-41a1-9d04-309a7db16685\") " pod="openshift-dns-operator/dns-operator-744455d44c-p8tbn" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032649 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-metrics-certs\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032670 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/05d9d8f9-d2d0-48ad-9583-5caaf4675cd4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-c8fh9\" (UID: \"05d9d8f9-d2d0-48ad-9583-5caaf4675cd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032688 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da00c0bc-2ff1-4b15-be1f-8fac48921976-config\") pod \"machine-api-operator-5694c8668f-cvqg4\" (UID: \"da00c0bc-2ff1-4b15-be1f-8fac48921976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032712 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032750 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5aabdab2-8553-4784-ac9e-8d1a42b5d32b-trusted-ca\") pod \"console-operator-58897d9998-qpsf8\" (UID: \"5aabdab2-8553-4784-ac9e-8d1a42b5d32b\") " pod="openshift-console-operator/console-operator-58897d9998-qpsf8" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.032965 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.033048 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/edb0a4af-3bf8-4517-af4a-a7546e9acf87-audit-dir\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.033317 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.033695 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.034470 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-client-ca\") pod \"route-controller-manager-6576b87f9c-fkfvp\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.034659 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aabdab2-8553-4784-ac9e-8d1a42b5d32b-config\") pod \"console-operator-58897d9998-qpsf8\" (UID: \"5aabdab2-8553-4784-ac9e-8d1a42b5d32b\") " pod="openshift-console-operator/console-operator-58897d9998-qpsf8" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.035150 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/edb0a4af-3bf8-4517-af4a-a7546e9acf87-audit-policies\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.036962 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-image-import-ca\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.037725 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dcd68a9c-6c86-41a1-9d04-309a7db16685-metrics-tls\") pod \"dns-operator-744455d44c-p8tbn\" (UID: \"dcd68a9c-6c86-41a1-9d04-309a7db16685\") " pod="openshift-dns-operator/dns-operator-744455d44c-p8tbn" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.037799 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca24069e-6f60-4d3b-950e-bf6a87fa1955-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m4kxh\" (UID: 
\"ca24069e-6f60-4d3b-950e-bf6a87fa1955\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.037826 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-service-ca\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.037855 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcade120-6711-4045-9149-08985699febd-console-serving-cert\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.037898 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/edb0a4af-3bf8-4517-af4a-a7546e9acf87-etcd-client\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.037904 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.037928 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-fkfvp\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.037959 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq266\" (UniqueName: \"kubernetes.io/projected/05d9d8f9-d2d0-48ad-9583-5caaf4675cd4-kube-api-access-pq266\") pod \"openshift-config-operator-7777fb866f-c8fh9\" (UID: \"05d9d8f9-d2d0-48ad-9583-5caaf4675cd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.037986 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcade120-6711-4045-9149-08985699febd-console-oauth-config\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038023 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q5sw\" (UniqueName: \"kubernetes.io/projected/bcade120-6711-4045-9149-08985699febd-kube-api-access-8q5sw\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038052 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-audit\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038075 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b6460aa-400a-4a67-9ac2-93bf4268e610-config\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038094 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8a0187cd-b67d-47ee-a791-08939d1b4cc5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cl7vz\" (UID: \"8a0187cd-b67d-47ee-a791-08939d1b4cc5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038114 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/edb0a4af-3bf8-4517-af4a-a7546e9acf87-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038136 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-client-ca\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038168 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-etcd-ca\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038187 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2px2k\" (UniqueName: \"kubernetes.io/projected/3a6d75f9-0ef3-4c99-8d68-7809b17fc607-kube-api-access-2px2k\") pod \"cluster-samples-operator-665b6dd947-vckmv\" (UID: \"3a6d75f9-0ef3-4c99-8d68-7809b17fc607\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038209 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aabdab2-8553-4784-ac9e-8d1a42b5d32b-serving-cert\") pod \"console-operator-58897d9998-qpsf8\" (UID: \"5aabdab2-8553-4784-ac9e-8d1a42b5d32b\") " pod="openshift-console-operator/console-operator-58897d9998-qpsf8" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038261 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnjwx\" (UniqueName: \"kubernetes.io/projected/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-kube-api-access-rnjwx\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038291 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/da00c0bc-2ff1-4b15-be1f-8fac48921976-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cvqg4\" (UID: \"da00c0bc-2ff1-4b15-be1f-8fac48921976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038315 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7b6460aa-400a-4a67-9ac2-93bf4268e610-service-ca-bundle\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038334 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gnjn\" (UniqueName: \"kubernetes.io/projected/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-kube-api-access-8gnjn\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038352 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd075d6a-7d15-421a-b546-19c5cab789d3-auth-proxy-config\") pod \"machine-approver-56656f9798-xgchr\" (UID: \"dd075d6a-7d15-421a-b546-19c5cab789d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038378 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-config\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038395 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp55m\" (UniqueName: \"kubernetes.io/projected/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-kube-api-access-vp55m\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 
21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.038426 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6c99\" (UniqueName: \"kubernetes.io/projected/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-kube-api-access-p6c99\") pod \"route-controller-manager-6576b87f9c-fkfvp\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.040857 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/794862b0-a985-459a-98d4-cc82612d3593-serving-cert\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.041479 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-etcd-serving-ca\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.042830 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/da00c0bc-2ff1-4b15-be1f-8fac48921976-images\") pod \"machine-api-operator-5694c8668f-cvqg4\" (UID: \"da00c0bc-2ff1-4b15-be1f-8fac48921976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.043760 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-config\") pod \"route-controller-manager-6576b87f9c-fkfvp\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.044441 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-serving-cert\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.044965 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f66c5870-b6ec-4673-8229-259391f6dada-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwprr\" (UID: \"f66c5870-b6ec-4673-8229-259391f6dada\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.045300 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/edb0a4af-3bf8-4517-af4a-a7546e9acf87-encryption-config\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.045378 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-node-pullsecrets\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.046559 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xh9jk\" 
(UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.047216 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.047433 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f66c5870-b6ec-4673-8229-259391f6dada-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwprr\" (UID: \"f66c5870-b6ec-4673-8229-259391f6dada\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.047946 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-config\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.048986 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da00c0bc-2ff1-4b15-be1f-8fac48921976-config\") pod \"machine-api-operator-5694c8668f-cvqg4\" (UID: \"da00c0bc-2ff1-4b15-be1f-8fac48921976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.049806 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5aabdab2-8553-4784-ac9e-8d1a42b5d32b-trusted-ca\") pod \"console-operator-58897d9998-qpsf8\" (UID: \"5aabdab2-8553-4784-ac9e-8d1a42b5d32b\") " pod="openshift-console-operator/console-operator-58897d9998-qpsf8" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.050151 4580 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.050308 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.053261 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9c2c4c6-575b-4b3a-91ba-f694e4005859-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jhb4w\" (UID: \"c9c2c4c6-575b-4b3a-91ba-f694e4005859\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.053372 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.053446 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.053933 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-audit-dir\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.055175 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.055661 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.055958 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-vhp42"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.056111 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.057410 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.059025 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-48dqz"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.059770 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p8tbn"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.060343 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/edb0a4af-3bf8-4517-af4a-a7546e9acf87-etcd-client\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.061446 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-config\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.067101 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-br8h5"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.067230 4580 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.067356 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.068004 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.068926 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.069149 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/edb0a4af-3bf8-4517-af4a-a7546e9acf87-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.069804 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-client-ca\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.069876 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hnl8s"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.081486 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.082051 4580 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.082295 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.082693 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.082763 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.083245 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.083464 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.084983 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-audit\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.090930 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.091488 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.106024 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/da00c0bc-2ff1-4b15-be1f-8fac48921976-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cvqg4\" (UID: \"da00c0bc-2ff1-4b15-be1f-8fac48921976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.106440 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-encryption-config\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.107423 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb0a4af-3bf8-4517-af4a-a7546e9acf87-serving-cert\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.107535 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-etcd-client\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.108031 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.109647 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.109846 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9c2c4c6-575b-4b3a-91ba-f694e4005859-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jhb4w\" (UID: \"c9c2c4c6-575b-4b3a-91ba-f694e4005859\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.108033 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-serving-cert\") pod \"route-controller-manager-6576b87f9c-fkfvp\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.108148 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.110081 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aabdab2-8553-4784-ac9e-8d1a42b5d32b-serving-cert\") pod \"console-operator-58897d9998-qpsf8\" (UID: \"5aabdab2-8553-4784-ac9e-8d1a42b5d32b\") " pod="openshift-console-operator/console-operator-58897d9998-qpsf8" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.110402 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.111224 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.113720 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bqkqg"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.118121 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.121042 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jnpfl"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.123426 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jnpfl" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.124289 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.134512 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gltvw"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.141643 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gltvw" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.142473 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.142520 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a0187cd-b67d-47ee-a791-08939d1b4cc5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cl7vz\" (UID: \"8a0187cd-b67d-47ee-a791-08939d1b4cc5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.142580 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a87434a7-bb73-4552-a502-c1a31119cff7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hcpgb\" (UID: \"a87434a7-bb73-4552-a502-c1a31119cff7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb" Mar 21 04:55:38 crc 
kubenswrapper[4580]: I0321 04:55:38.142634 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.142667 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p87tw\" (UniqueName: \"kubernetes.io/projected/7b6460aa-400a-4a67-9ac2-93bf4268e610-kube-api-access-p87tw\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.142707 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-service-ca-bundle\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.142736 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a6d75f9-0ef3-4c99-8d68-7809b17fc607-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vckmv\" (UID: \"3a6d75f9-0ef3-4c99-8d68-7809b17fc607\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.142767 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.142810 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt4pb\" (UniqueName: \"kubernetes.io/projected/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-kube-api-access-jt4pb\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.142859 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.142901 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-trusted-ca-bundle\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.142947 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 
04:55:38.142985 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrt66\" (UniqueName: \"kubernetes.io/projected/dcd68a9c-6c86-41a1-9d04-309a7db16685-kube-api-access-mrt66\") pod \"dns-operator-744455d44c-p8tbn\" (UID: \"dcd68a9c-6c86-41a1-9d04-309a7db16685\") " pod="openshift-dns-operator/dns-operator-744455d44c-p8tbn" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143015 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-metrics-certs\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143037 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/05d9d8f9-d2d0-48ad-9583-5caaf4675cd4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-c8fh9\" (UID: \"05d9d8f9-d2d0-48ad-9583-5caaf4675cd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143070 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dcd68a9c-6c86-41a1-9d04-309a7db16685-metrics-tls\") pod \"dns-operator-744455d44c-p8tbn\" (UID: \"dcd68a9c-6c86-41a1-9d04-309a7db16685\") " pod="openshift-dns-operator/dns-operator-744455d44c-p8tbn" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143096 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-service-ca\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 
04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143118 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca24069e-6f60-4d3b-950e-bf6a87fa1955-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m4kxh\" (UID: \"ca24069e-6f60-4d3b-950e-bf6a87fa1955\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143142 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq266\" (UniqueName: \"kubernetes.io/projected/05d9d8f9-d2d0-48ad-9583-5caaf4675cd4-kube-api-access-pq266\") pod \"openshift-config-operator-7777fb866f-c8fh9\" (UID: \"05d9d8f9-d2d0-48ad-9583-5caaf4675cd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143165 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcade120-6711-4045-9149-08985699febd-console-serving-cert\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143190 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b6460aa-400a-4a67-9ac2-93bf4268e610-config\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143222 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8a0187cd-b67d-47ee-a791-08939d1b4cc5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cl7vz\" 
(UID: \"8a0187cd-b67d-47ee-a791-08939d1b4cc5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143252 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcade120-6711-4045-9149-08985699febd-console-oauth-config\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143280 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q5sw\" (UniqueName: \"kubernetes.io/projected/bcade120-6711-4045-9149-08985699febd-kube-api-access-8q5sw\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143306 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2px2k\" (UniqueName: \"kubernetes.io/projected/3a6d75f9-0ef3-4c99-8d68-7809b17fc607-kube-api-access-2px2k\") pod \"cluster-samples-operator-665b6dd947-vckmv\" (UID: \"3a6d75f9-0ef3-4c99-8d68-7809b17fc607\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143334 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-etcd-ca\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143371 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gnjn\" (UniqueName: 
\"kubernetes.io/projected/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-kube-api-access-8gnjn\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143405 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b6460aa-400a-4a67-9ac2-93bf4268e610-service-ca-bundle\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143430 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp55m\" (UniqueName: \"kubernetes.io/projected/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-kube-api-access-vp55m\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143468 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd075d6a-7d15-421a-b546-19c5cab789d3-auth-proxy-config\") pod \"machine-approver-56656f9798-xgchr\" (UID: \"dd075d6a-7d15-421a-b546-19c5cab789d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143523 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b6460aa-400a-4a67-9ac2-93bf4268e610-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" Mar 21 04:55:38 crc 
kubenswrapper[4580]: I0321 04:55:38.143590 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-default-certificate\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143747 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a0187cd-b67d-47ee-a791-08939d1b4cc5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cl7vz\" (UID: \"8a0187cd-b67d-47ee-a791-08939d1b4cc5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143825 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a87434a7-bb73-4552-a502-c1a31119cff7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hcpgb\" (UID: \"a87434a7-bb73-4552-a502-c1a31119cff7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143903 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-audit-dir\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.143938 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kmfk5\" 
(UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144013 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-stats-auth\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144048 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-serving-cert\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144125 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca24069e-6f60-4d3b-950e-bf6a87fa1955-config\") pod \"kube-apiserver-operator-766d6c64bb-m4kxh\" (UID: \"ca24069e-6f60-4d3b-950e-bf6a87fa1955\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144162 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a87434a7-bb73-4552-a502-c1a31119cff7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hcpgb\" (UID: \"a87434a7-bb73-4552-a502-c1a31119cff7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144204 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/dd075d6a-7d15-421a-b546-19c5cab789d3-machine-approver-tls\") pod \"machine-approver-56656f9798-xgchr\" (UID: \"dd075d6a-7d15-421a-b546-19c5cab789d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144360 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-etcd-client\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144447 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzqwz\" (UniqueName: \"kubernetes.io/projected/8a0187cd-b67d-47ee-a791-08939d1b4cc5-kube-api-access-xzqwz\") pod \"cluster-image-registry-operator-dc59b4c8b-cl7vz\" (UID: \"8a0187cd-b67d-47ee-a791-08939d1b4cc5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144506 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144548 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 
crc kubenswrapper[4580]: I0321 04:55:38.144579 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzvkb\" (UniqueName: \"kubernetes.io/projected/69b1f163-8594-47b1-85c7-3330e0d50d8f-kube-api-access-gzvkb\") pod \"downloads-7954f5f757-4ffj8\" (UID: \"69b1f163-8594-47b1-85c7-3330e0d50d8f\") " pod="openshift-console/downloads-7954f5f757-4ffj8" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144609 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nvn7\" (UniqueName: \"kubernetes.io/projected/dd075d6a-7d15-421a-b546-19c5cab789d3-kube-api-access-9nvn7\") pod \"machine-approver-56656f9798-xgchr\" (UID: \"dd075d6a-7d15-421a-b546-19c5cab789d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144680 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b6460aa-400a-4a67-9ac2-93bf4268e610-serving-cert\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144707 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca24069e-6f60-4d3b-950e-bf6a87fa1955-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m4kxh\" (UID: \"ca24069e-6f60-4d3b-950e-bf6a87fa1955\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144769 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-config\") pod \"etcd-operator-b45778765-2rbc4\" (UID: 
\"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144815 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05d9d8f9-d2d0-48ad-9583-5caaf4675cd4-serving-cert\") pod \"openshift-config-operator-7777fb866f-c8fh9\" (UID: \"05d9d8f9-d2d0-48ad-9583-5caaf4675cd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144899 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144945 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-etcd-service-ca\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.144968 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.142516 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.146294 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567814-8cxbg"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.146395 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a87434a7-bb73-4552-a502-c1a31119cff7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hcpgb\" (UID: \"a87434a7-bb73-4552-a502-c1a31119cff7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.147038 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/05d9d8f9-d2d0-48ad-9583-5caaf4675cd4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-c8fh9\" (UID: \"05d9d8f9-d2d0-48ad-9583-5caaf4675cd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.149653 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.150430 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-service-ca\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc 
kubenswrapper[4580]: I0321 04:55:38.153425 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a0187cd-b67d-47ee-a791-08939d1b4cc5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cl7vz\" (UID: \"8a0187cd-b67d-47ee-a791-08939d1b4cc5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.153829 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-etcd-ca\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.155710 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-trusted-ca-bundle\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.156349 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcade120-6711-4045-9149-08985699febd-console-oauth-config\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.157092 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-8cxbg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.158122 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.158333 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-oauth-serving-cert\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.158501 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-audit-policies\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.158673 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-console-config\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.158866 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd075d6a-7d15-421a-b546-19c5cab789d3-config\") pod 
\"machine-approver-56656f9798-xgchr\" (UID: \"dd075d6a-7d15-421a-b546-19c5cab789d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.159484 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dcd68a9c-6c86-41a1-9d04-309a7db16685-metrics-tls\") pod \"dns-operator-744455d44c-p8tbn\" (UID: \"dcd68a9c-6c86-41a1-9d04-309a7db16685\") " pod="openshift-dns-operator/dns-operator-744455d44c-p8tbn" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.159769 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.160020 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd075d6a-7d15-421a-b546-19c5cab789d3-config\") pod \"machine-approver-56656f9798-xgchr\" (UID: \"dd075d6a-7d15-421a-b546-19c5cab789d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.161333 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.165717 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-oauth-serving-cert\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc 
kubenswrapper[4580]: I0321 04:55:38.166176 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-audit-policies\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.166685 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-console-config\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.158731 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd075d6a-7d15-421a-b546-19c5cab789d3-auth-proxy-config\") pod \"machine-approver-56656f9798-xgchr\" (UID: \"dd075d6a-7d15-421a-b546-19c5cab789d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.170541 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-audit-dir\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.173070 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05d9d8f9-d2d0-48ad-9583-5caaf4675cd4-serving-cert\") pod \"openshift-config-operator-7777fb866f-c8fh9\" (UID: \"05d9d8f9-d2d0-48ad-9583-5caaf4675cd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" Mar 21 
04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.173936 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.174315 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a0187cd-b67d-47ee-a791-08939d1b4cc5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cl7vz\" (UID: \"8a0187cd-b67d-47ee-a791-08939d1b4cc5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.175310 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-etcd-service-ca\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.175589 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.175933 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-error\") pod 
\"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.176125 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-config\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.176238 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dd075d6a-7d15-421a-b546-19c5cab789d3-machine-approver-tls\") pod \"machine-approver-56656f9798-xgchr\" (UID: \"dd075d6a-7d15-421a-b546-19c5cab789d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.176806 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a87434a7-bb73-4552-a502-c1a31119cff7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hcpgb\" (UID: \"a87434a7-bb73-4552-a502-c1a31119cff7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.177061 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.177375 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.177729 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.177725 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-serving-cert\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.178145 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-etcd-client\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.178583 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.179904 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a6d75f9-0ef3-4c99-8d68-7809b17fc607-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-vckmv\" (UID: \"3a6d75f9-0ef3-4c99-8d68-7809b17fc607\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.180069 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kmfk5"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.180260 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.180766 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.181979 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.182309 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcade120-6711-4045-9149-08985699febd-console-serving-cert\") pod \"console-f9d7485db-48dqz\" (UID: 
\"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.183148 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rsrjh"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.185518 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.186928 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.189235 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rsrjh" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.191141 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b6460aa-400a-4a67-9ac2-93bf4268e610-serving-cert\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.195655 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.198243 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.201177 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.203061 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.203760 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.204835 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.207025 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9csml"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.209190 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9csml" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.210948 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.213479 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.214768 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-xd8hf"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.215659 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xd8hf" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.216116 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4ffj8"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.219927 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.221041 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2rbc4"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.224495 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.225388 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.227954 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.230066 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.232386 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.232390 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b6460aa-400a-4a67-9ac2-93bf4268e610-config\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.234386 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.235772 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gltvw"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.237286 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.238632 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.240258 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rsrjh"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.241609 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hnl8s"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.243049 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9csml"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.244546 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-8cxbg"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.246874 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.248367 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rrbz2"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.250582 4580 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jnpfl"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.250721 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.252150 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rrbz2"] Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.255009 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.261034 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b6460aa-400a-4a67-9ac2-93bf4268e610-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.265513 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.273017 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b6460aa-400a-4a67-9ac2-93bf4268e610-service-ca-bundle\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.284095 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.304258 4580 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.323813 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.329107 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-service-ca-bundle\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.344708 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.352638 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-default-certificate\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.364199 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.368512 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-stats-auth\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.383942 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 21 04:55:38 crc 
kubenswrapper[4580]: I0321 04:55:38.397153 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-metrics-certs\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.405351 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.424756 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.444392 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.453916 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca24069e-6f60-4d3b-950e-bf6a87fa1955-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m4kxh\" (UID: \"ca24069e-6f60-4d3b-950e-bf6a87fa1955\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.464340 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.485796 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.494010 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca24069e-6f60-4d3b-950e-bf6a87fa1955-config\") 
pod \"kube-apiserver-operator-766d6c64bb-m4kxh\" (UID: \"ca24069e-6f60-4d3b-950e-bf6a87fa1955\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.503886 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.523838 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.545510 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.564214 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.584578 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.604738 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.624546 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.644936 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.664920 4580 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.684467 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.703940 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.723891 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.751372 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.764441 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.784961 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.804985 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.824673 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.843748 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.864091 4580 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.884968 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.924340 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.944951 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 21 04:55:38 crc kubenswrapper[4580]: I0321 04:55:38.980484 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkts9\" (UniqueName: \"kubernetes.io/projected/da00c0bc-2ff1-4b15-be1f-8fac48921976-kube-api-access-gkts9\") pod \"machine-api-operator-5694c8668f-cvqg4\" (UID: \"da00c0bc-2ff1-4b15-be1f-8fac48921976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.000880 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zdqm\" (UniqueName: \"kubernetes.io/projected/c9c2c4c6-575b-4b3a-91ba-f694e4005859-kube-api-access-5zdqm\") pod \"openshift-apiserver-operator-796bbdcf4f-jhb4w\" (UID: \"c9c2c4c6-575b-4b3a-91ba-f694e4005859\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.020266 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptcjw\" (UniqueName: \"kubernetes.io/projected/f66c5870-b6ec-4673-8229-259391f6dada-kube-api-access-ptcjw\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwprr\" (UID: \"f66c5870-b6ec-4673-8229-259391f6dada\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.041747 4580 request.go:700] Waited for 1.004539525s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.044434 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8fcs\" (UniqueName: \"kubernetes.io/projected/edb0a4af-3bf8-4517-af4a-a7546e9acf87-kube-api-access-n8fcs\") pod \"apiserver-7bbb656c7d-zgk58\" (UID: \"edb0a4af-3bf8-4517-af4a-a7546e9acf87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.062023 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqdxh\" (UniqueName: \"kubernetes.io/projected/794862b0-a985-459a-98d4-cc82612d3593-kube-api-access-kqdxh\") pod \"controller-manager-879f6c89f-xh9jk\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.076549 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.090986 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6c99\" (UniqueName: \"kubernetes.io/projected/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-kube-api-access-p6c99\") pod \"route-controller-manager-6576b87f9c-fkfvp\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.105243 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.106854 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d8jb\" (UniqueName: \"kubernetes.io/projected/5aabdab2-8553-4784-ac9e-8d1a42b5d32b-kube-api-access-8d8jb\") pod \"console-operator-58897d9998-qpsf8\" (UID: \"5aabdab2-8553-4784-ac9e-8d1a42b5d32b\") " pod="openshift-console-operator/console-operator-58897d9998-qpsf8" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.114967 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.124813 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.141355 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.145074 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.151700 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.164528 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.170492 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qpsf8" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.184076 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.198556 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.242361 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnjwx\" (UniqueName: \"kubernetes.io/projected/1b720ad7-2de4-43c9-bab3-81c68d5dfde7-kube-api-access-rnjwx\") pod \"apiserver-76f77b778f-fshpg\" (UID: \"1b720ad7-2de4-43c9-bab3-81c68d5dfde7\") " pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.244315 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.266368 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.289803 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.307400 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.324775 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.343427 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.345315 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.366881 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.391086 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.406025 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.407390 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.423847 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.441999 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58"] Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.444890 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.474443 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.485083 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 21 04:55:39 
crc kubenswrapper[4580]: I0321 04:55:39.505147 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.532607 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.543748 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.549105 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cvqg4"] Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.550767 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr"] Mar 21 04:55:39 crc kubenswrapper[4580]: W0321 04:55:39.560410 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda00c0bc_2ff1_4b15_be1f_8fac48921976.slice/crio-c02475296be39bb890346198d1e9e23d2fe14d793156e72549e4456c8bbbeaa0 WatchSource:0}: Error finding container c02475296be39bb890346198d1e9e23d2fe14d793156e72549e4456c8bbbeaa0: Status 404 returned error can't find the container with id c02475296be39bb890346198d1e9e23d2fe14d793156e72549e4456c8bbbeaa0 Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.567266 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 04:55:39 crc kubenswrapper[4580]: W0321 04:55:39.578175 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf66c5870_b6ec_4673_8229_259391f6dada.slice/crio-41b54b00d3162c9a5a842b9bc1e805c5067dccadc7e7534db9e1cbec3d0f76b4 
WatchSource:0}: Error finding container 41b54b00d3162c9a5a842b9bc1e805c5067dccadc7e7534db9e1cbec3d0f76b4: Status 404 returned error can't find the container with id 41b54b00d3162c9a5a842b9bc1e805c5067dccadc7e7534db9e1cbec3d0f76b4 Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.582125 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp"] Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.584235 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 04:55:39 crc kubenswrapper[4580]: W0321 04:55:39.593048 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f5c10bd_9b6e_41dc_b0a9_30b5f59ad6cc.slice/crio-571bec49042e797343b28addacb4e9dc7ba0af0a2c60b367dabf355bf9aa4fc3 WatchSource:0}: Error finding container 571bec49042e797343b28addacb4e9dc7ba0af0a2c60b367dabf355bf9aa4fc3: Status 404 returned error can't find the container with id 571bec49042e797343b28addacb4e9dc7ba0af0a2c60b367dabf355bf9aa4fc3 Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.609834 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.623963 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.645033 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.660019 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fshpg"] Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.665448 4580 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.685423 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.689109 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w"] Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.693583 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qpsf8"] Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.710701 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.726839 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xh9jk"] Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.729945 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 21 04:55:39 crc kubenswrapper[4580]: W0321 04:55:39.738193 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aabdab2_8553_4784_ac9e_8d1a42b5d32b.slice/crio-84783025fa5bf3557757e17639717cda945d130138b3bfa2b95ef4a700e57d5b WatchSource:0}: Error finding container 84783025fa5bf3557757e17639717cda945d130138b3bfa2b95ef4a700e57d5b: Status 404 returned error can't find the container with id 84783025fa5bf3557757e17639717cda945d130138b3bfa2b95ef4a700e57d5b Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.745604 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.763238 4580 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.807838 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gnjn\" (UniqueName: \"kubernetes.io/projected/41dae12a-fc3b-4e2b-a64a-f4f4c791afbc-kube-api-access-8gnjn\") pod \"router-default-5444994796-dnnd6\" (UID: \"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc\") " pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.854999 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p87tw\" (UniqueName: \"kubernetes.io/projected/7b6460aa-400a-4a67-9ac2-93bf4268e610-kube-api-access-p87tw\") pod \"authentication-operator-69f744f599-vhp42\" (UID: \"7b6460aa-400a-4a67-9ac2-93bf4268e610\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.859416 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq266\" (UniqueName: \"kubernetes.io/projected/05d9d8f9-d2d0-48ad-9583-5caaf4675cd4-kube-api-access-pq266\") pod \"openshift-config-operator-7777fb866f-c8fh9\" (UID: \"05d9d8f9-d2d0-48ad-9583-5caaf4675cd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.862879 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" event={"ID":"794862b0-a985-459a-98d4-cc82612d3593","Type":"ContainerStarted","Data":"219ebaba072964d000a1323fd9f0aae6f78872e382b87a95841b92ceefe9e8dc"} Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.864154 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w" 
event={"ID":"c9c2c4c6-575b-4b3a-91ba-f694e4005859","Type":"ContainerStarted","Data":"11a110c26372335555358aae2ca2f8bb19b233dbed2899085e0257ff1b46df0a"} Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.864866 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qpsf8" event={"ID":"5aabdab2-8553-4784-ac9e-8d1a42b5d32b","Type":"ContainerStarted","Data":"84783025fa5bf3557757e17639717cda945d130138b3bfa2b95ef4a700e57d5b"} Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.868309 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fshpg" event={"ID":"1b720ad7-2de4-43c9-bab3-81c68d5dfde7","Type":"ContainerStarted","Data":"b8048e19068ff60f4db2e3731080021b2e8c8cd15da33cb2b37a5516d6484d0c"} Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.870962 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" event={"ID":"da00c0bc-2ff1-4b15-be1f-8fac48921976","Type":"ContainerStarted","Data":"ee437bdfb3317f4809a647fe50b64f591b046426db7700826445fdacc3d91cb1"} Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.870994 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" event={"ID":"da00c0bc-2ff1-4b15-be1f-8fac48921976","Type":"ContainerStarted","Data":"c02475296be39bb890346198d1e9e23d2fe14d793156e72549e4456c8bbbeaa0"} Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.877382 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" event={"ID":"edb0a4af-3bf8-4517-af4a-a7546e9acf87","Type":"ContainerStarted","Data":"b1f416411c90759479c8ea5a7b2685d848885d049116d386cee34e138b3a47b7"} Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.881620 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q5sw\" (UniqueName: 
\"kubernetes.io/projected/bcade120-6711-4045-9149-08985699febd-kube-api-access-8q5sw\") pod \"console-f9d7485db-48dqz\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " pod="openshift-console/console-f9d7485db-48dqz"
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.884029 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr" event={"ID":"f66c5870-b6ec-4673-8229-259391f6dada","Type":"ContainerStarted","Data":"a0d6b994df242ab2a29f8fd7b34743044f1623f64c7f4bed4aa5e424e3981eff"}
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.884106 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr" event={"ID":"f66c5870-b6ec-4673-8229-259391f6dada","Type":"ContainerStarted","Data":"41b54b00d3162c9a5a842b9bc1e805c5067dccadc7e7534db9e1cbec3d0f76b4"}
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.885695 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrt66\" (UniqueName: \"kubernetes.io/projected/dcd68a9c-6c86-41a1-9d04-309a7db16685-kube-api-access-mrt66\") pod \"dns-operator-744455d44c-p8tbn\" (UID: \"dcd68a9c-6c86-41a1-9d04-309a7db16685\") " pod="openshift-dns-operator/dns-operator-744455d44c-p8tbn"
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.893324 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" event={"ID":"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc","Type":"ContainerStarted","Data":"49544dea4693e0ee539af213e7cbfcc9b1333d129a0e81353ce02c7b2902b6c9"}
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.893392 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" event={"ID":"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc","Type":"ContainerStarted","Data":"571bec49042e797343b28addacb4e9dc7ba0af0a2c60b367dabf355bf9aa4fc3"}
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.894540 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp"
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.901138 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt4pb\" (UniqueName: \"kubernetes.io/projected/8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa-kube-api-access-jt4pb\") pod \"etcd-operator-b45778765-2rbc4\" (UID: \"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4"
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.906744 4580 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-fkfvp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.906946 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" podUID="8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.910059 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9"
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.930678 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8a0187cd-b67d-47ee-a791-08939d1b4cc5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cl7vz\" (UID: \"8a0187cd-b67d-47ee-a791-08939d1b4cc5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz"
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.942217 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2px2k\" (UniqueName: \"kubernetes.io/projected/3a6d75f9-0ef3-4c99-8d68-7809b17fc607-kube-api-access-2px2k\") pod \"cluster-samples-operator-665b6dd947-vckmv\" (UID: \"3a6d75f9-0ef3-4c99-8d68-7809b17fc607\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv"
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.943657 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:55:39 crc kubenswrapper[4580]: I0321 04:55:39.965168 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.001404 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a87434a7-bb73-4552-a502-c1a31119cff7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hcpgb\" (UID: \"a87434a7-bb73-4552-a502-c1a31119cff7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.015859 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.020766 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp55m\" (UniqueName: \"kubernetes.io/projected/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-kube-api-access-vp55m\") pod \"oauth-openshift-558db77b4-kmfk5\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.026037 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.034104 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-p8tbn"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.040944 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzqwz\" (UniqueName: \"kubernetes.io/projected/8a0187cd-b67d-47ee-a791-08939d1b4cc5-kube-api-access-xzqwz\") pod \"cluster-image-registry-operator-dc59b4c8b-cl7vz\" (UID: \"8a0187cd-b67d-47ee-a791-08939d1b4cc5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.042180 4580 request.go:700] Waited for 1.873030655s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver-operator/serviceaccounts/kube-apiserver-operator/token
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.043261 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-dnnd6"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.060248 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.074996 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca24069e-6f60-4d3b-950e-bf6a87fa1955-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m4kxh\" (UID: \"ca24069e-6f60-4d3b-950e-bf6a87fa1955\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh"
Mar 21 04:55:40 crc kubenswrapper[4580]: W0321 04:55:40.078728 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41dae12a_fc3b_4e2b_a64a_f4f4c791afbc.slice/crio-94390c0805e2e0b871ebf778d73c444b92e875407a43523e23aeda7598b0b9a2 WatchSource:0}: Error finding container 94390c0805e2e0b871ebf778d73c444b92e875407a43523e23aeda7598b0b9a2: Status 404 returned error can't find the container with id 94390c0805e2e0b871ebf778d73c444b92e875407a43523e23aeda7598b0b9a2
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.088251 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzvkb\" (UniqueName: \"kubernetes.io/projected/69b1f163-8594-47b1-85c7-3330e0d50d8f-kube-api-access-gzvkb\") pod \"downloads-7954f5f757-4ffj8\" (UID: \"69b1f163-8594-47b1-85c7-3330e0d50d8f\") " pod="openshift-console/downloads-7954f5f757-4ffj8"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.092738 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.104402 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nvn7\" (UniqueName: \"kubernetes.io/projected/dd075d6a-7d15-421a-b546-19c5cab789d3-kube-api-access-9nvn7\") pod \"machine-approver-56656f9798-xgchr\" (UID: \"dd075d6a-7d15-421a-b546-19c5cab789d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.107002 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.136976 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.143519 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.158225 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-48dqz"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.164205 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.186341 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.224998 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.226471 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.232203 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.259416 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.263360 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9"]
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.270144 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4ffj8"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.271811 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.286353 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.287403 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.296521 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.304359 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.304713 4580 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.325698 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.346524 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.398888 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb"]
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.421332 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/239df9bb-a89d-49c9-b889-8cd32c1db001-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-62hm8\" (UID: \"239df9bb-a89d-49c9-b889-8cd32c1db001\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.421830 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b67fb1a5-405c-4419-baa5-e144d82fb317-trusted-ca\") pod \"ingress-operator-5b745b69d9-t7bxt\" (UID: \"b67fb1a5-405c-4419-baa5-e144d82fb317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.421923 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7p86\" (UniqueName: \"kubernetes.io/projected/119cdbf4-8721-4241-80c1-a85b8df6ce52-kube-api-access-v7p86\") pod \"migrator-59844c95c7-br8h5\" (UID: \"119cdbf4-8721-4241-80c1-a85b8df6ce52\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-br8h5"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422073 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4msg\" (UniqueName: \"kubernetes.io/projected/6aa8e3c1-9f91-43c2-9eac-6c146199916e-kube-api-access-x4msg\") pod \"machine-config-operator-74547568cd-q8th9\" (UID: \"6aa8e3c1-9f91-43c2-9eac-6c146199916e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422102 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4pwx\" (UniqueName: \"kubernetes.io/projected/bc866c38-7a32-4dab-a741-204309afddc5-kube-api-access-j4pwx\") pod \"kube-storage-version-migrator-operator-b67b599dd-bnhsl\" (UID: \"bc866c38-7a32-4dab-a741-204309afddc5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422156 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8snfv\" (UniqueName: \"kubernetes.io/projected/26340f0a-c057-4d3a-aac0-45e31a795929-kube-api-access-8snfv\") pod \"machine-config-controller-84d6567774-jp84t\" (UID: \"26340f0a-c057-4d3a-aac0-45e31a795929\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422201 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6aa8e3c1-9f91-43c2-9eac-6c146199916e-proxy-tls\") pod \"machine-config-operator-74547568cd-q8th9\" (UID: \"6aa8e3c1-9f91-43c2-9eac-6c146199916e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422216 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85kgm\" (UniqueName: \"kubernetes.io/projected/b67fb1a5-405c-4419-baa5-e144d82fb317-kube-api-access-85kgm\") pod \"ingress-operator-5b745b69d9-t7bxt\" (UID: \"b67fb1a5-405c-4419-baa5-e144d82fb317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422232 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/621054bf-a821-4811-b7b6-5b7d011b8a05-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422260 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz8w7\" (UniqueName: \"kubernetes.io/projected/48bae9ea-47af-484e-a8c4-b6c3e49438e5-kube-api-access-jz8w7\") pod \"control-plane-machine-set-operator-78cbb6b69f-jjvsm\" (UID: \"48bae9ea-47af-484e-a8c4-b6c3e49438e5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422275 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/239df9bb-a89d-49c9-b889-8cd32c1db001-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-62hm8\" (UID: \"239df9bb-a89d-49c9-b889-8cd32c1db001\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422321 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc866c38-7a32-4dab-a741-204309afddc5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bnhsl\" (UID: \"bc866c38-7a32-4dab-a741-204309afddc5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422338 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/239df9bb-a89d-49c9-b889-8cd32c1db001-config\") pod \"kube-controller-manager-operator-78b949d7b-62hm8\" (UID: \"239df9bb-a89d-49c9-b889-8cd32c1db001\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422404 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6aa8e3c1-9f91-43c2-9eac-6c146199916e-images\") pod \"machine-config-operator-74547568cd-q8th9\" (UID: \"6aa8e3c1-9f91-43c2-9eac-6c146199916e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422418 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b67fb1a5-405c-4419-baa5-e144d82fb317-metrics-tls\") pod \"ingress-operator-5b745b69d9-t7bxt\" (UID: \"b67fb1a5-405c-4419-baa5-e144d82fb317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422442 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc866c38-7a32-4dab-a741-204309afddc5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bnhsl\" (UID: \"bc866c38-7a32-4dab-a741-204309afddc5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422515 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-registry-tls\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422591 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b67fb1a5-405c-4419-baa5-e144d82fb317-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t7bxt\" (UID: \"b67fb1a5-405c-4419-baa5-e144d82fb317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422641 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422677 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26340f0a-c057-4d3a-aac0-45e31a795929-proxy-tls\") pod \"machine-config-controller-84d6567774-jp84t\" (UID: \"26340f0a-c057-4d3a-aac0-45e31a795929\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422693 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/48bae9ea-47af-484e-a8c4-b6c3e49438e5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jjvsm\" (UID: \"48bae9ea-47af-484e-a8c4-b6c3e49438e5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422711 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/26340f0a-c057-4d3a-aac0-45e31a795929-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jp84t\" (UID: \"26340f0a-c057-4d3a-aac0-45e31a795929\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422809 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/621054bf-a821-4811-b7b6-5b7d011b8a05-registry-certificates\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422855 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/621054bf-a821-4811-b7b6-5b7d011b8a05-trusted-ca\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422900 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6aa8e3c1-9f91-43c2-9eac-6c146199916e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q8th9\" (UID: \"6aa8e3c1-9f91-43c2-9eac-6c146199916e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422920 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nvfz\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-kube-api-access-8nvfz\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422951 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/621054bf-a821-4811-b7b6-5b7d011b8a05-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.422978 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-bound-sa-token\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: E0321 04:55:40.436077 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:40.935756056 +0000 UTC m=+246.018339884 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.493239 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p8tbn"]
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.500201 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vhp42"]
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.527922 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.528162 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8snfv\" (UniqueName: \"kubernetes.io/projected/26340f0a-c057-4d3a-aac0-45e31a795929-kube-api-access-8snfv\") pod \"machine-config-controller-84d6567774-jp84t\" (UID: \"26340f0a-c057-4d3a-aac0-45e31a795929\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t"
Mar 21 04:55:40 crc kubenswrapper[4580]: E0321 04:55:40.528209 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:41.028185562 +0000 UTC m=+246.110769190 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.528238 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/973f371e-32fd-4c04-a76c-2a8e8cc54f5d-profile-collector-cert\") pod \"catalog-operator-68c6474976-46znb\" (UID: \"973f371e-32fd-4c04-a76c-2a8e8cc54f5d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.528267 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8d1b089c-8016-458b-83b5-84f602ea0ba7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hnl8s\" (UID: \"8d1b089c-8016-458b-83b5-84f602ea0ba7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.528299 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85kgm\" (UniqueName: \"kubernetes.io/projected/b67fb1a5-405c-4419-baa5-e144d82fb317-kube-api-access-85kgm\") pod \"ingress-operator-5b745b69d9-t7bxt\" (UID: \"b67fb1a5-405c-4419-baa5-e144d82fb317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.528325 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6aa8e3c1-9f91-43c2-9eac-6c146199916e-proxy-tls\") pod \"machine-config-operator-74547568cd-q8th9\" (UID: \"6aa8e3c1-9f91-43c2-9eac-6c146199916e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.528348 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/621054bf-a821-4811-b7b6-5b7d011b8a05-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.528370 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d9cce850-4a50-4a52-ac9b-147fcbde086a-srv-cert\") pod \"olm-operator-6b444d44fb-xgkxr\" (UID: \"d9cce850-4a50-4a52-ac9b-147fcbde086a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.529329 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/621054bf-a821-4811-b7b6-5b7d011b8a05-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.530964 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz8w7\" (UniqueName: \"kubernetes.io/projected/48bae9ea-47af-484e-a8c4-b6c3e49438e5-kube-api-access-jz8w7\") pod \"control-plane-machine-set-operator-78cbb6b69f-jjvsm\" (UID: \"48bae9ea-47af-484e-a8c4-b6c3e49438e5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531041 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/239df9bb-a89d-49c9-b889-8cd32c1db001-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-62hm8\" (UID: \"239df9bb-a89d-49c9-b889-8cd32c1db001\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531086 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc866c38-7a32-4dab-a741-204309afddc5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bnhsl\" (UID: \"bc866c38-7a32-4dab-a741-204309afddc5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531117 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-csi-data-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531148 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e86baa4-7923-4d5e-bb0f-67085a562d68-config\") pod \"service-ca-operator-777779d784-fzl5v\" (UID: \"1e86baa4-7923-4d5e-bb0f-67085a562d68\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531179 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/239df9bb-a89d-49c9-b889-8cd32c1db001-config\") pod \"kube-controller-manager-operator-78b949d7b-62hm8\" (UID: \"239df9bb-a89d-49c9-b889-8cd32c1db001\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531199 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92b13d13-f88e-47cc-8815-34b54fd68711-secret-volume\") pod \"collect-profiles-29567805-jljjq\" (UID: \"92b13d13-f88e-47cc-8815-34b54fd68711\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531237 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6aa8e3c1-9f91-43c2-9eac-6c146199916e-images\") pod \"machine-config-operator-74547568cd-q8th9\" (UID: \"6aa8e3c1-9f91-43c2-9eac-6c146199916e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531270 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b67fb1a5-405c-4419-baa5-e144d82fb317-metrics-tls\") pod \"ingress-operator-5b745b69d9-t7bxt\" (UID: \"b67fb1a5-405c-4419-baa5-e144d82fb317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531294 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc866c38-7a32-4dab-a741-204309afddc5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bnhsl\" (UID: \"bc866c38-7a32-4dab-a741-204309afddc5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531315 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-plugins-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531346 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-socket-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531390 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-registry-tls\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531471 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b67fb1a5-405c-4419-baa5-e144d82fb317-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t7bxt\" (UID: \"b67fb1a5-405c-4419-baa5-e144d82fb317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.531507 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.532835 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6aa8e3c1-9f91-43c2-9eac-6c146199916e-images\") pod \"machine-config-operator-74547568cd-q8th9\" (UID: \"6aa8e3c1-9f91-43c2-9eac-6c146199916e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.534219 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc866c38-7a32-4dab-a741-204309afddc5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bnhsl\" (UID: \"bc866c38-7a32-4dab-a741-204309afddc5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.534870 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/239df9bb-a89d-49c9-b889-8cd32c1db001-config\") pod \"kube-controller-manager-operator-78b949d7b-62hm8\" (UID: \"239df9bb-a89d-49c9-b889-8cd32c1db001\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.534960 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stl52\" (UniqueName: \"kubernetes.io/projected/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-kube-api-access-stl52\") pod
\"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.534997 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26340f0a-c057-4d3a-aac0-45e31a795929-proxy-tls\") pod \"machine-config-controller-84d6567774-jp84t\" (UID: \"26340f0a-c057-4d3a-aac0-45e31a795929\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535024 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/48bae9ea-47af-484e-a8c4-b6c3e49438e5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jjvsm\" (UID: \"48bae9ea-47af-484e-a8c4-b6c3e49438e5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535066 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/26340f0a-c057-4d3a-aac0-45e31a795929-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jp84t\" (UID: \"26340f0a-c057-4d3a-aac0-45e31a795929\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535088 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92b13d13-f88e-47cc-8815-34b54fd68711-config-volume\") pod \"collect-profiles-29567805-jljjq\" (UID: \"92b13d13-f88e-47cc-8815-34b54fd68711\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 
04:55:40.535202 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7spcf\" (UniqueName: \"kubernetes.io/projected/96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6-kube-api-access-7spcf\") pod \"multus-admission-controller-857f4d67dd-jnpfl\" (UID: \"96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jnpfl" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535244 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v5vg\" (UniqueName: \"kubernetes.io/projected/1e86baa4-7923-4d5e-bb0f-67085a562d68-kube-api-access-2v5vg\") pod \"service-ca-operator-777779d784-fzl5v\" (UID: \"1e86baa4-7923-4d5e-bb0f-67085a562d68\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535283 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7njkm\" (UniqueName: \"kubernetes.io/projected/1714688f-61d5-436b-baaf-2668757942fd-kube-api-access-7njkm\") pod \"auto-csr-approver-29567814-8cxbg\" (UID: \"1714688f-61d5-436b-baaf-2668757942fd\") " pod="openshift-infra/auto-csr-approver-29567814-8cxbg" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535320 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/621054bf-a821-4811-b7b6-5b7d011b8a05-registry-certificates\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535350 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/621054bf-a821-4811-b7b6-5b7d011b8a05-trusted-ca\") pod 
\"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535375 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tltsc\" (UniqueName: \"kubernetes.io/projected/10f921f7-e891-49fd-825b-37843ebc2f29-kube-api-access-tltsc\") pod \"dns-default-9csml\" (UID: \"10f921f7-e891-49fd-825b-37843ebc2f29\") " pod="openshift-dns/dns-default-9csml" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535409 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cwjc\" (UniqueName: \"kubernetes.io/projected/973f371e-32fd-4c04-a76c-2a8e8cc54f5d-kube-api-access-5cwjc\") pod \"catalog-operator-68c6474976-46znb\" (UID: \"973f371e-32fd-4c04-a76c-2a8e8cc54f5d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535436 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nvfz\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-kube-api-access-8nvfz\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535460 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6aa8e3c1-9f91-43c2-9eac-6c146199916e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q8th9\" (UID: \"6aa8e3c1-9f91-43c2-9eac-6c146199916e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535485 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10f921f7-e891-49fd-825b-37843ebc2f29-config-volume\") pod \"dns-default-9csml\" (UID: \"10f921f7-e891-49fd-825b-37843ebc2f29\") " pod="openshift-dns/dns-default-9csml" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535524 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/621054bf-a821-4811-b7b6-5b7d011b8a05-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535547 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm27b\" (UniqueName: \"kubernetes.io/projected/84d4caa0-b9cd-40b1-ab0f-0903523cfbce-kube-api-access-fm27b\") pod \"machine-config-server-xd8hf\" (UID: \"84d4caa0-b9cd-40b1-ab0f-0903523cfbce\") " pod="openshift-machine-config-operator/machine-config-server-xd8hf" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535574 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4caa55-55f9-42e7-94d2-0069efa2a4ae-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kzsgp\" (UID: \"cd4caa55-55f9-42e7-94d2-0069efa2a4ae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535606 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nlqt\" (UniqueName: \"kubernetes.io/projected/92b13d13-f88e-47cc-8815-34b54fd68711-kube-api-access-8nlqt\") pod \"collect-profiles-29567805-jljjq\" (UID: 
\"92b13d13-f88e-47cc-8815-34b54fd68711\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535635 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vsqs\" (UniqueName: \"kubernetes.io/projected/377f7953-d6ed-4d67-a92c-07da9f5075d3-kube-api-access-2vsqs\") pod \"service-ca-9c57cc56f-gltvw\" (UID: \"377f7953-d6ed-4d67-a92c-07da9f5075d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-gltvw" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535658 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-bound-sa-token\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535682 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jnpfl\" (UID: \"96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jnpfl" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.535709 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d9cce850-4a50-4a52-ac9b-147fcbde086a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xgkxr\" (UID: \"d9cce850-4a50-4a52-ac9b-147fcbde086a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536056 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/239df9bb-a89d-49c9-b889-8cd32c1db001-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-62hm8\" (UID: \"239df9bb-a89d-49c9-b889-8cd32c1db001\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536084 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b67fb1a5-405c-4419-baa5-e144d82fb317-trusted-ca\") pod \"ingress-operator-5b745b69d9-t7bxt\" (UID: \"b67fb1a5-405c-4419-baa5-e144d82fb317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536108 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/377f7953-d6ed-4d67-a92c-07da9f5075d3-signing-key\") pod \"service-ca-9c57cc56f-gltvw\" (UID: \"377f7953-d6ed-4d67-a92c-07da9f5075d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-gltvw" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536131 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/377f7953-d6ed-4d67-a92c-07da9f5075d3-signing-cabundle\") pod \"service-ca-9c57cc56f-gltvw\" (UID: \"377f7953-d6ed-4d67-a92c-07da9f5075d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-gltvw" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536158 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/84d4caa0-b9cd-40b1-ab0f-0903523cfbce-certs\") pod \"machine-config-server-xd8hf\" (UID: \"84d4caa0-b9cd-40b1-ab0f-0903523cfbce\") " pod="openshift-machine-config-operator/machine-config-server-xd8hf" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 
04:55:40.536181 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d1b089c-8016-458b-83b5-84f602ea0ba7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hnl8s\" (UID: \"8d1b089c-8016-458b-83b5-84f602ea0ba7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536200 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e86baa4-7923-4d5e-bb0f-67085a562d68-serving-cert\") pod \"service-ca-operator-777779d784-fzl5v\" (UID: \"1e86baa4-7923-4d5e-bb0f-67085a562d68\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536225 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5185e797-2aa2-4012-96e0-0afb8c92a09e-tmpfs\") pod \"packageserver-d55dfcdfc-rsrfm\" (UID: \"5185e797-2aa2-4012-96e0-0afb8c92a09e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536259 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10f921f7-e891-49fd-825b-37843ebc2f29-metrics-tls\") pod \"dns-default-9csml\" (UID: \"10f921f7-e891-49fd-825b-37843ebc2f29\") " pod="openshift-dns/dns-default-9csml" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536277 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsmvm\" (UniqueName: \"kubernetes.io/projected/5185e797-2aa2-4012-96e0-0afb8c92a09e-kube-api-access-tsmvm\") pod \"packageserver-d55dfcdfc-rsrfm\" (UID: 
\"5185e797-2aa2-4012-96e0-0afb8c92a09e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536298 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zk7t\" (UniqueName: \"kubernetes.io/projected/cd4caa55-55f9-42e7-94d2-0069efa2a4ae-kube-api-access-9zk7t\") pod \"package-server-manager-789f6589d5-kzsgp\" (UID: \"cd4caa55-55f9-42e7-94d2-0069efa2a4ae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536329 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7p86\" (UniqueName: \"kubernetes.io/projected/119cdbf4-8721-4241-80c1-a85b8df6ce52-kube-api-access-v7p86\") pod \"migrator-59844c95c7-br8h5\" (UID: \"119cdbf4-8721-4241-80c1-a85b8df6ce52\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-br8h5" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536354 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5185e797-2aa2-4012-96e0-0afb8c92a09e-webhook-cert\") pod \"packageserver-d55dfcdfc-rsrfm\" (UID: \"5185e797-2aa2-4012-96e0-0afb8c92a09e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536386 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/973f371e-32fd-4c04-a76c-2a8e8cc54f5d-srv-cert\") pod \"catalog-operator-68c6474976-46znb\" (UID: \"973f371e-32fd-4c04-a76c-2a8e8cc54f5d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536407 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-registration-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536424 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/84d4caa0-b9cd-40b1-ab0f-0903523cfbce-node-bootstrap-token\") pod \"machine-config-server-xd8hf\" (UID: \"84d4caa0-b9cd-40b1-ab0f-0903523cfbce\") " pod="openshift-machine-config-operator/machine-config-server-xd8hf" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536443 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf3a3e39-aab5-499e-a39a-0b74c9d7d666-cert\") pod \"ingress-canary-rsrjh\" (UID: \"bf3a3e39-aab5-499e-a39a-0b74c9d7d666\") " pod="openshift-ingress-canary/ingress-canary-rsrjh" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536482 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5185e797-2aa2-4012-96e0-0afb8c92a09e-apiservice-cert\") pod \"packageserver-d55dfcdfc-rsrfm\" (UID: \"5185e797-2aa2-4012-96e0-0afb8c92a09e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536510 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzv59\" (UniqueName: \"kubernetes.io/projected/8d1b089c-8016-458b-83b5-84f602ea0ba7-kube-api-access-vzv59\") pod \"marketplace-operator-79b997595-hnl8s\" (UID: \"8d1b089c-8016-458b-83b5-84f602ea0ba7\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536544 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4msg\" (UniqueName: \"kubernetes.io/projected/6aa8e3c1-9f91-43c2-9eac-6c146199916e-kube-api-access-x4msg\") pod \"machine-config-operator-74547568cd-q8th9\" (UID: \"6aa8e3c1-9f91-43c2-9eac-6c146199916e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536570 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4pwx\" (UniqueName: \"kubernetes.io/projected/bc866c38-7a32-4dab-a741-204309afddc5-kube-api-access-j4pwx\") pod \"kube-storage-version-migrator-operator-b67b599dd-bnhsl\" (UID: \"bc866c38-7a32-4dab-a741-204309afddc5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536592 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69895\" (UniqueName: \"kubernetes.io/projected/d9cce850-4a50-4a52-ac9b-147fcbde086a-kube-api-access-69895\") pod \"olm-operator-6b444d44fb-xgkxr\" (UID: \"d9cce850-4a50-4a52-ac9b-147fcbde086a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536615 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-mountpoint-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.536637 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb8q2\" (UniqueName: \"kubernetes.io/projected/bf3a3e39-aab5-499e-a39a-0b74c9d7d666-kube-api-access-rb8q2\") pod \"ingress-canary-rsrjh\" (UID: \"bf3a3e39-aab5-499e-a39a-0b74c9d7d666\") " pod="openshift-ingress-canary/ingress-canary-rsrjh" Mar 21 04:55:40 crc kubenswrapper[4580]: E0321 04:55:40.537025 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:41.037011702 +0000 UTC m=+246.119595540 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.544072 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b67fb1a5-405c-4419-baa5-e144d82fb317-trusted-ca\") pod \"ingress-operator-5b745b69d9-t7bxt\" (UID: \"b67fb1a5-405c-4419-baa5-e144d82fb317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.544082 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b67fb1a5-405c-4419-baa5-e144d82fb317-metrics-tls\") pod \"ingress-operator-5b745b69d9-t7bxt\" (UID: \"b67fb1a5-405c-4419-baa5-e144d82fb317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 
04:55:40.545202 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/26340f0a-c057-4d3a-aac0-45e31a795929-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jp84t\" (UID: \"26340f0a-c057-4d3a-aac0-45e31a795929\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.545224 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6aa8e3c1-9f91-43c2-9eac-6c146199916e-proxy-tls\") pod \"machine-config-operator-74547568cd-q8th9\" (UID: \"6aa8e3c1-9f91-43c2-9eac-6c146199916e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.545950 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6aa8e3c1-9f91-43c2-9eac-6c146199916e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q8th9\" (UID: \"6aa8e3c1-9f91-43c2-9eac-6c146199916e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.548512 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc866c38-7a32-4dab-a741-204309afddc5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bnhsl\" (UID: \"bc866c38-7a32-4dab-a741-204309afddc5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.549153 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/621054bf-a821-4811-b7b6-5b7d011b8a05-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.552056 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/621054bf-a821-4811-b7b6-5b7d011b8a05-trusted-ca\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.552471 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/621054bf-a821-4811-b7b6-5b7d011b8a05-registry-certificates\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.557805 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-registry-tls\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.558162 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26340f0a-c057-4d3a-aac0-45e31a795929-proxy-tls\") pod \"machine-config-controller-84d6567774-jp84t\" (UID: \"26340f0a-c057-4d3a-aac0-45e31a795929\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.560014 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/48bae9ea-47af-484e-a8c4-b6c3e49438e5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jjvsm\" (UID: \"48bae9ea-47af-484e-a8c4-b6c3e49438e5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.570396 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/239df9bb-a89d-49c9-b889-8cd32c1db001-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-62hm8\" (UID: \"239df9bb-a89d-49c9-b889-8cd32c1db001\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.586624 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8snfv\" (UniqueName: \"kubernetes.io/projected/26340f0a-c057-4d3a-aac0-45e31a795929-kube-api-access-8snfv\") pod \"machine-config-controller-84d6567774-jp84t\" (UID: \"26340f0a-c057-4d3a-aac0-45e31a795929\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.593227 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2rbc4"] Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.614990 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85kgm\" (UniqueName: \"kubernetes.io/projected/b67fb1a5-405c-4419-baa5-e144d82fb317-kube-api-access-85kgm\") pod \"ingress-operator-5b745b69d9-t7bxt\" (UID: \"b67fb1a5-405c-4419-baa5-e144d82fb317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.627685 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz8w7\" (UniqueName: 
\"kubernetes.io/projected/48bae9ea-47af-484e-a8c4-b6c3e49438e5-kube-api-access-jz8w7\") pod \"control-plane-machine-set-operator-78cbb6b69f-jjvsm\" (UID: \"48bae9ea-47af-484e-a8c4-b6c3e49438e5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm" Mar 21 04:55:40 crc kubenswrapper[4580]: W0321 04:55:40.637004 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ec0221a_e2f4_4bc9_b6d3_5faf99da51aa.slice/crio-d7e2b0b7c75854afdeb2d40a97b77d3f238c4d80db84e41b1a945a43e62eab17 WatchSource:0}: Error finding container d7e2b0b7c75854afdeb2d40a97b77d3f238c4d80db84e41b1a945a43e62eab17: Status 404 returned error can't find the container with id d7e2b0b7c75854afdeb2d40a97b77d3f238c4d80db84e41b1a945a43e62eab17 Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.637401 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.637601 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzv59\" (UniqueName: \"kubernetes.io/projected/8d1b089c-8016-458b-83b5-84f602ea0ba7-kube-api-access-vzv59\") pod \"marketplace-operator-79b997595-hnl8s\" (UID: \"8d1b089c-8016-458b-83b5-84f602ea0ba7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.637652 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69895\" (UniqueName: \"kubernetes.io/projected/d9cce850-4a50-4a52-ac9b-147fcbde086a-kube-api-access-69895\") pod \"olm-operator-6b444d44fb-xgkxr\" (UID: 
\"d9cce850-4a50-4a52-ac9b-147fcbde086a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.637678 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-mountpoint-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.637697 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb8q2\" (UniqueName: \"kubernetes.io/projected/bf3a3e39-aab5-499e-a39a-0b74c9d7d666-kube-api-access-rb8q2\") pod \"ingress-canary-rsrjh\" (UID: \"bf3a3e39-aab5-499e-a39a-0b74c9d7d666\") " pod="openshift-ingress-canary/ingress-canary-rsrjh" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.637725 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/973f371e-32fd-4c04-a76c-2a8e8cc54f5d-profile-collector-cert\") pod \"catalog-operator-68c6474976-46znb\" (UID: \"973f371e-32fd-4c04-a76c-2a8e8cc54f5d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.637749 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8d1b089c-8016-458b-83b5-84f602ea0ba7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hnl8s\" (UID: \"8d1b089c-8016-458b-83b5-84f602ea0ba7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.637775 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/d9cce850-4a50-4a52-ac9b-147fcbde086a-srv-cert\") pod \"olm-operator-6b444d44fb-xgkxr\" (UID: \"d9cce850-4a50-4a52-ac9b-147fcbde086a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.637834 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-csi-data-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.637863 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e86baa4-7923-4d5e-bb0f-67085a562d68-config\") pod \"service-ca-operator-777779d784-fzl5v\" (UID: \"1e86baa4-7923-4d5e-bb0f-67085a562d68\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.637891 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92b13d13-f88e-47cc-8815-34b54fd68711-secret-volume\") pod \"collect-profiles-29567805-jljjq\" (UID: \"92b13d13-f88e-47cc-8815-34b54fd68711\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.637920 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-plugins-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.637941 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-socket-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638015 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stl52\" (UniqueName: \"kubernetes.io/projected/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-kube-api-access-stl52\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638044 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92b13d13-f88e-47cc-8815-34b54fd68711-config-volume\") pod \"collect-profiles-29567805-jljjq\" (UID: \"92b13d13-f88e-47cc-8815-34b54fd68711\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638070 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7spcf\" (UniqueName: \"kubernetes.io/projected/96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6-kube-api-access-7spcf\") pod \"multus-admission-controller-857f4d67dd-jnpfl\" (UID: \"96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jnpfl" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638092 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v5vg\" (UniqueName: \"kubernetes.io/projected/1e86baa4-7923-4d5e-bb0f-67085a562d68-kube-api-access-2v5vg\") pod \"service-ca-operator-777779d784-fzl5v\" (UID: \"1e86baa4-7923-4d5e-bb0f-67085a562d68\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638112 4580 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7njkm\" (UniqueName: \"kubernetes.io/projected/1714688f-61d5-436b-baaf-2668757942fd-kube-api-access-7njkm\") pod \"auto-csr-approver-29567814-8cxbg\" (UID: \"1714688f-61d5-436b-baaf-2668757942fd\") " pod="openshift-infra/auto-csr-approver-29567814-8cxbg" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638136 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tltsc\" (UniqueName: \"kubernetes.io/projected/10f921f7-e891-49fd-825b-37843ebc2f29-kube-api-access-tltsc\") pod \"dns-default-9csml\" (UID: \"10f921f7-e891-49fd-825b-37843ebc2f29\") " pod="openshift-dns/dns-default-9csml" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638160 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cwjc\" (UniqueName: \"kubernetes.io/projected/973f371e-32fd-4c04-a76c-2a8e8cc54f5d-kube-api-access-5cwjc\") pod \"catalog-operator-68c6474976-46znb\" (UID: \"973f371e-32fd-4c04-a76c-2a8e8cc54f5d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638190 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10f921f7-e891-49fd-825b-37843ebc2f29-config-volume\") pod \"dns-default-9csml\" (UID: \"10f921f7-e891-49fd-825b-37843ebc2f29\") " pod="openshift-dns/dns-default-9csml" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638223 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm27b\" (UniqueName: \"kubernetes.io/projected/84d4caa0-b9cd-40b1-ab0f-0903523cfbce-kube-api-access-fm27b\") pod \"machine-config-server-xd8hf\" (UID: \"84d4caa0-b9cd-40b1-ab0f-0903523cfbce\") " pod="openshift-machine-config-operator/machine-config-server-xd8hf" Mar 21 04:55:40 crc 
kubenswrapper[4580]: I0321 04:55:40.638248 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4caa55-55f9-42e7-94d2-0069efa2a4ae-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kzsgp\" (UID: \"cd4caa55-55f9-42e7-94d2-0069efa2a4ae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638271 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nlqt\" (UniqueName: \"kubernetes.io/projected/92b13d13-f88e-47cc-8815-34b54fd68711-kube-api-access-8nlqt\") pod \"collect-profiles-29567805-jljjq\" (UID: \"92b13d13-f88e-47cc-8815-34b54fd68711\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638296 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vsqs\" (UniqueName: \"kubernetes.io/projected/377f7953-d6ed-4d67-a92c-07da9f5075d3-kube-api-access-2vsqs\") pod \"service-ca-9c57cc56f-gltvw\" (UID: \"377f7953-d6ed-4d67-a92c-07da9f5075d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-gltvw" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638325 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jnpfl\" (UID: \"96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jnpfl" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638348 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d9cce850-4a50-4a52-ac9b-147fcbde086a-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-xgkxr\" (UID: \"d9cce850-4a50-4a52-ac9b-147fcbde086a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638374 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/377f7953-d6ed-4d67-a92c-07da9f5075d3-signing-key\") pod \"service-ca-9c57cc56f-gltvw\" (UID: \"377f7953-d6ed-4d67-a92c-07da9f5075d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-gltvw" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638392 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/377f7953-d6ed-4d67-a92c-07da9f5075d3-signing-cabundle\") pod \"service-ca-9c57cc56f-gltvw\" (UID: \"377f7953-d6ed-4d67-a92c-07da9f5075d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-gltvw" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638408 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/84d4caa0-b9cd-40b1-ab0f-0903523cfbce-certs\") pod \"machine-config-server-xd8hf\" (UID: \"84d4caa0-b9cd-40b1-ab0f-0903523cfbce\") " pod="openshift-machine-config-operator/machine-config-server-xd8hf" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638429 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e86baa4-7923-4d5e-bb0f-67085a562d68-serving-cert\") pod \"service-ca-operator-777779d784-fzl5v\" (UID: \"1e86baa4-7923-4d5e-bb0f-67085a562d68\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638452 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8d1b089c-8016-458b-83b5-84f602ea0ba7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hnl8s\" (UID: \"8d1b089c-8016-458b-83b5-84f602ea0ba7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638471 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5185e797-2aa2-4012-96e0-0afb8c92a09e-tmpfs\") pod \"packageserver-d55dfcdfc-rsrfm\" (UID: \"5185e797-2aa2-4012-96e0-0afb8c92a09e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638492 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10f921f7-e891-49fd-825b-37843ebc2f29-metrics-tls\") pod \"dns-default-9csml\" (UID: \"10f921f7-e891-49fd-825b-37843ebc2f29\") " pod="openshift-dns/dns-default-9csml" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638514 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsmvm\" (UniqueName: \"kubernetes.io/projected/5185e797-2aa2-4012-96e0-0afb8c92a09e-kube-api-access-tsmvm\") pod \"packageserver-d55dfcdfc-rsrfm\" (UID: \"5185e797-2aa2-4012-96e0-0afb8c92a09e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638538 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zk7t\" (UniqueName: \"kubernetes.io/projected/cd4caa55-55f9-42e7-94d2-0069efa2a4ae-kube-api-access-9zk7t\") pod \"package-server-manager-789f6589d5-kzsgp\" (UID: \"cd4caa55-55f9-42e7-94d2-0069efa2a4ae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638569 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5185e797-2aa2-4012-96e0-0afb8c92a09e-webhook-cert\") pod \"packageserver-d55dfcdfc-rsrfm\" (UID: \"5185e797-2aa2-4012-96e0-0afb8c92a09e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638600 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/973f371e-32fd-4c04-a76c-2a8e8cc54f5d-srv-cert\") pod \"catalog-operator-68c6474976-46znb\" (UID: \"973f371e-32fd-4c04-a76c-2a8e8cc54f5d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638621 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-registration-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638646 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/84d4caa0-b9cd-40b1-ab0f-0903523cfbce-node-bootstrap-token\") pod \"machine-config-server-xd8hf\" (UID: \"84d4caa0-b9cd-40b1-ab0f-0903523cfbce\") " pod="openshift-machine-config-operator/machine-config-server-xd8hf" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638666 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf3a3e39-aab5-499e-a39a-0b74c9d7d666-cert\") pod \"ingress-canary-rsrjh\" (UID: \"bf3a3e39-aab5-499e-a39a-0b74c9d7d666\") " pod="openshift-ingress-canary/ingress-canary-rsrjh" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.638695 4580 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5185e797-2aa2-4012-96e0-0afb8c92a09e-apiservice-cert\") pod \"packageserver-d55dfcdfc-rsrfm\" (UID: \"5185e797-2aa2-4012-96e0-0afb8c92a09e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.639617 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10f921f7-e891-49fd-825b-37843ebc2f29-config-volume\") pod \"dns-default-9csml\" (UID: \"10f921f7-e891-49fd-825b-37843ebc2f29\") " pod="openshift-dns/dns-default-9csml" Mar 21 04:55:40 crc kubenswrapper[4580]: E0321 04:55:40.639705 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:41.139687114 +0000 UTC m=+246.222270742 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.639905 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-mountpoint-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.640473 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-socket-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.646200 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92b13d13-f88e-47cc-8815-34b54fd68711-config-volume\") pod \"collect-profiles-29567805-jljjq\" (UID: \"92b13d13-f88e-47cc-8815-34b54fd68711\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.651584 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-plugins-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" 
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.652907 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5185e797-2aa2-4012-96e0-0afb8c92a09e-tmpfs\") pod \"packageserver-d55dfcdfc-rsrfm\" (UID: \"5185e797-2aa2-4012-96e0-0afb8c92a09e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.655989 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-csi-data-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.656739 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e86baa4-7923-4d5e-bb0f-67085a562d68-config\") pod \"service-ca-operator-777779d784-fzl5v\" (UID: \"1e86baa4-7923-4d5e-bb0f-67085a562d68\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.657399 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jnpfl\" (UID: \"96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jnpfl" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.659100 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-registration-dir\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" Mar 21 04:55:40 crc 
kubenswrapper[4580]: I0321 04:55:40.660107 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/377f7953-d6ed-4d67-a92c-07da9f5075d3-signing-cabundle\") pod \"service-ca-9c57cc56f-gltvw\" (UID: \"377f7953-d6ed-4d67-a92c-07da9f5075d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-gltvw" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.660560 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8d1b089c-8016-458b-83b5-84f602ea0ba7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hnl8s\" (UID: \"8d1b089c-8016-458b-83b5-84f602ea0ba7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.661403 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e86baa4-7923-4d5e-bb0f-67085a562d68-serving-cert\") pod \"service-ca-operator-777779d784-fzl5v\" (UID: \"1e86baa4-7923-4d5e-bb0f-67085a562d68\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.662747 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/84d4caa0-b9cd-40b1-ab0f-0903523cfbce-certs\") pod \"machine-config-server-xd8hf\" (UID: \"84d4caa0-b9cd-40b1-ab0f-0903523cfbce\") " pod="openshift-machine-config-operator/machine-config-server-xd8hf" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.663360 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5185e797-2aa2-4012-96e0-0afb8c92a09e-apiservice-cert\") pod \"packageserver-d55dfcdfc-rsrfm\" (UID: \"5185e797-2aa2-4012-96e0-0afb8c92a09e\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.664160 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d9cce850-4a50-4a52-ac9b-147fcbde086a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xgkxr\" (UID: \"d9cce850-4a50-4a52-ac9b-147fcbde086a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.665908 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92b13d13-f88e-47cc-8815-34b54fd68711-secret-volume\") pod \"collect-profiles-29567805-jljjq\" (UID: \"92b13d13-f88e-47cc-8815-34b54fd68711\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.668052 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4caa55-55f9-42e7-94d2-0069efa2a4ae-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kzsgp\" (UID: \"cd4caa55-55f9-42e7-94d2-0069efa2a4ae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.668424 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/377f7953-d6ed-4d67-a92c-07da9f5075d3-signing-key\") pod \"service-ca-9c57cc56f-gltvw\" (UID: \"377f7953-d6ed-4d67-a92c-07da9f5075d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-gltvw" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.669321 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10f921f7-e891-49fd-825b-37843ebc2f29-metrics-tls\") pod 
\"dns-default-9csml\" (UID: \"10f921f7-e891-49fd-825b-37843ebc2f29\") " pod="openshift-dns/dns-default-9csml" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.671587 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/239df9bb-a89d-49c9-b889-8cd32c1db001-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-62hm8\" (UID: \"239df9bb-a89d-49c9-b889-8cd32c1db001\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.674755 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b67fb1a5-405c-4419-baa5-e144d82fb317-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t7bxt\" (UID: \"b67fb1a5-405c-4419-baa5-e144d82fb317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.681467 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf3a3e39-aab5-499e-a39a-0b74c9d7d666-cert\") pod \"ingress-canary-rsrjh\" (UID: \"bf3a3e39-aab5-499e-a39a-0b74c9d7d666\") " pod="openshift-ingress-canary/ingress-canary-rsrjh" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.683195 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5185e797-2aa2-4012-96e0-0afb8c92a09e-webhook-cert\") pod \"packageserver-d55dfcdfc-rsrfm\" (UID: \"5185e797-2aa2-4012-96e0-0afb8c92a09e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.685181 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.686332 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/973f371e-32fd-4c04-a76c-2a8e8cc54f5d-profile-collector-cert\") pod \"catalog-operator-68c6474976-46znb\" (UID: \"973f371e-32fd-4c04-a76c-2a8e8cc54f5d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.691510 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d9cce850-4a50-4a52-ac9b-147fcbde086a-srv-cert\") pod \"olm-operator-6b444d44fb-xgkxr\" (UID: \"d9cce850-4a50-4a52-ac9b-147fcbde086a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.693440 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d1b089c-8016-458b-83b5-84f602ea0ba7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hnl8s\" (UID: \"8d1b089c-8016-458b-83b5-84f602ea0ba7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.694002 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/84d4caa0-b9cd-40b1-ab0f-0903523cfbce-node-bootstrap-token\") pod \"machine-config-server-xd8hf\" (UID: \"84d4caa0-b9cd-40b1-ab0f-0903523cfbce\") " pod="openshift-machine-config-operator/machine-config-server-xd8hf" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.698049 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7p86\" (UniqueName: 
\"kubernetes.io/projected/119cdbf4-8721-4241-80c1-a85b8df6ce52-kube-api-access-v7p86\") pod \"migrator-59844c95c7-br8h5\" (UID: \"119cdbf4-8721-4241-80c1-a85b8df6ce52\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-br8h5" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.699845 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-br8h5" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.708530 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/973f371e-32fd-4c04-a76c-2a8e8cc54f5d-srv-cert\") pod \"catalog-operator-68c6474976-46znb\" (UID: \"973f371e-32fd-4c04-a76c-2a8e8cc54f5d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.713348 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nvfz\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-kube-api-access-8nvfz\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.721995 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.722475 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh"]
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.741599 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: E0321 04:55:40.758380 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:41.258355835 +0000 UTC m=+246.340939463 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.779643 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4pwx\" (UniqueName: \"kubernetes.io/projected/bc866c38-7a32-4dab-a741-204309afddc5-kube-api-access-j4pwx\") pod \"kube-storage-version-migrator-operator-b67b599dd-bnhsl\" (UID: \"bc866c38-7a32-4dab-a741-204309afddc5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.783516 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4msg\" (UniqueName: \"kubernetes.io/projected/6aa8e3c1-9f91-43c2-9eac-6c146199916e-kube-api-access-x4msg\") pod \"machine-config-operator-74547568cd-q8th9\" (UID: \"6aa8e3c1-9f91-43c2-9eac-6c146199916e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.801221 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-bound-sa-token\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.818265 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzv59\" (UniqueName: \"kubernetes.io/projected/8d1b089c-8016-458b-83b5-84f602ea0ba7-kube-api-access-vzv59\") pod \"marketplace-operator-79b997595-hnl8s\" (UID: \"8d1b089c-8016-458b-83b5-84f602ea0ba7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.856235 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4ffj8"]
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.856844 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.857288 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb8q2\" (UniqueName: \"kubernetes.io/projected/bf3a3e39-aab5-499e-a39a-0b74c9d7d666-kube-api-access-rb8q2\") pod \"ingress-canary-rsrjh\" (UID: \"bf3a3e39-aab5-499e-a39a-0b74c9d7d666\") " pod="openshift-ingress-canary/ingress-canary-rsrjh"
Mar 21 04:55:40 crc kubenswrapper[4580]: E0321 04:55:40.858362 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:41.35833243 +0000 UTC m=+246.440916078 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.859287 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: E0321 04:55:40.860673 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:41.360526004 +0000 UTC m=+246.443109632 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.860866 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kmfk5"]
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.863415 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rsrjh"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.863892 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69895\" (UniqueName: \"kubernetes.io/projected/d9cce850-4a50-4a52-ac9b-147fcbde086a-kube-api-access-69895\") pod \"olm-operator-6b444d44fb-xgkxr\" (UID: \"d9cce850-4a50-4a52-ac9b-147fcbde086a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.864962 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-48dqz"]
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.872265 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stl52\" (UniqueName: \"kubernetes.io/projected/e7a9eebb-7ea8-4e16-9d70-f35fb12df177-kube-api-access-stl52\") pod \"csi-hostpathplugin-rrbz2\" (UID: \"e7a9eebb-7ea8-4e16-9d70-f35fb12df177\") " pod="hostpath-provisioner/csi-hostpathplugin-rrbz2"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.887811 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rrbz2"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.898948 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7spcf\" (UniqueName: \"kubernetes.io/projected/96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6-kube-api-access-7spcf\") pod \"multus-admission-controller-857f4d67dd-jnpfl\" (UID: \"96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jnpfl"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.908988 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz"]
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.914448 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v5vg\" (UniqueName: \"kubernetes.io/projected/1e86baa4-7923-4d5e-bb0f-67085a562d68-kube-api-access-2v5vg\") pod \"service-ca-operator-777779d784-fzl5v\" (UID: \"1e86baa4-7923-4d5e-bb0f-67085a562d68\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.925552 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7njkm\" (UniqueName: \"kubernetes.io/projected/1714688f-61d5-436b-baaf-2668757942fd-kube-api-access-7njkm\") pod \"auto-csr-approver-29567814-8cxbg\" (UID: \"1714688f-61d5-436b-baaf-2668757942fd\") " pod="openshift-infra/auto-csr-approver-29567814-8cxbg"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.940331 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr" event={"ID":"dd075d6a-7d15-421a-b546-19c5cab789d3","Type":"ContainerStarted","Data":"1a53a6bf3433a07a883c4085ccd0991f8a1e498769c3542f7214a699c66158c9"}
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.949247 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tltsc\" (UniqueName: \"kubernetes.io/projected/10f921f7-e891-49fd-825b-37843ebc2f29-kube-api-access-tltsc\") pod \"dns-default-9csml\" (UID: \"10f921f7-e891-49fd-825b-37843ebc2f29\") " pod="openshift-dns/dns-default-9csml"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.956159 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.968063 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.969344 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.969404 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv"]
Mar 21 04:55:40 crc kubenswrapper[4580]: E0321 04:55:40.969922 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:41.469885903 +0000 UTC m=+246.552469531 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.970281 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:40 crc kubenswrapper[4580]: E0321 04:55:40.972963 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:41.472936539 +0000 UTC m=+246.555520167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.973440 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.980355 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cwjc\" (UniqueName: \"kubernetes.io/projected/973f371e-32fd-4c04-a76c-2a8e8cc54f5d-kube-api-access-5cwjc\") pod \"catalog-operator-68c6474976-46znb\" (UID: \"973f371e-32fd-4c04-a76c-2a8e8cc54f5d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb"
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.990468 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" event={"ID":"05d9d8f9-d2d0-48ad-9583-5caaf4675cd4","Type":"ContainerStarted","Data":"4cb3efa5d34cec4835768d906d9bf8950d07eac97385a7f37ce10b4adb0de9a6"}
Mar 21 04:55:40 crc kubenswrapper[4580]: I0321 04:55:40.990530 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" event={"ID":"05d9d8f9-d2d0-48ad-9583-5caaf4675cd4","Type":"ContainerStarted","Data":"dd455c8d07067ff7af8397e16de89f5c0f4881226e38b28f1fb5d5d399545d68"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.001079 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm27b\" (UniqueName: \"kubernetes.io/projected/84d4caa0-b9cd-40b1-ab0f-0903523cfbce-kube-api-access-fm27b\") pod \"machine-config-server-xd8hf\" (UID: \"84d4caa0-b9cd-40b1-ab0f-0903523cfbce\") " pod="openshift-machine-config-operator/machine-config-server-xd8hf"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.001319 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qpsf8" event={"ID":"5aabdab2-8553-4784-ac9e-8d1a42b5d32b","Type":"ContainerStarted","Data":"34c2f600f2c998d27e4c2616870a1fbfc06c4afb5256ed3e4736c79985884a59"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.002344 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-qpsf8"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.002401 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsmvm\" (UniqueName: \"kubernetes.io/projected/5185e797-2aa2-4012-96e0-0afb8c92a09e-kube-api-access-tsmvm\") pod \"packageserver-d55dfcdfc-rsrfm\" (UID: \"5185e797-2aa2-4012-96e0-0afb8c92a09e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.010208 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.016704 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p8tbn" event={"ID":"dcd68a9c-6c86-41a1-9d04-309a7db16685","Type":"ContainerStarted","Data":"fe9d8e4a2385e9449af31cc2d00378053d1386a262940b6c42adec140de4e291"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.019301 4580 patch_prober.go:28] interesting pod/console-operator-58897d9998-qpsf8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.019395 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qpsf8" podUID="5aabdab2-8553-4784-ac9e-8d1a42b5d32b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.024000 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zk7t\" (UniqueName: \"kubernetes.io/projected/cd4caa55-55f9-42e7-94d2-0069efa2a4ae-kube-api-access-9zk7t\") pod \"package-server-manager-789f6589d5-kzsgp\" (UID: \"cd4caa55-55f9-42e7-94d2-0069efa2a4ae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.029749 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" event={"ID":"da00c0bc-2ff1-4b15-be1f-8fac48921976","Type":"ContainerStarted","Data":"2d49e79a7ea59ba00dd320f6c34016b2c24513484af008fa62f19750323a395d"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.032485 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.046864 4580 generic.go:334] "Generic (PLEG): container finished" podID="edb0a4af-3bf8-4517-af4a-a7546e9acf87" containerID="465f4e792c7c441b2d16e2d62ee2618bc0f282f73dc3d80b3926bc3140520872" exitCode=0
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.046999 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" event={"ID":"edb0a4af-3bf8-4517-af4a-a7546e9acf87","Type":"ContainerDied","Data":"465f4e792c7c441b2d16e2d62ee2618bc0f282f73dc3d80b3926bc3140520872"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.049913 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nlqt\" (UniqueName: \"kubernetes.io/projected/92b13d13-f88e-47cc-8815-34b54fd68711-kube-api-access-8nlqt\") pod \"collect-profiles-29567805-jljjq\" (UID: \"92b13d13-f88e-47cc-8815-34b54fd68711\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.050980 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.053550 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.062007 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w" event={"ID":"c9c2c4c6-575b-4b3a-91ba-f694e4005859","Type":"ContainerStarted","Data":"45b23a17e3b14003367f230e6097b0c3dc4d279b0c678ebd33b9c231a5e4ecaf"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.066238 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.079682 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vsqs\" (UniqueName: \"kubernetes.io/projected/377f7953-d6ed-4d67-a92c-07da9f5075d3-kube-api-access-2vsqs\") pod \"service-ca-9c57cc56f-gltvw\" (UID: \"377f7953-d6ed-4d67-a92c-07da9f5075d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-gltvw"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.080506 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:41 crc kubenswrapper[4580]: E0321 04:55:41.081971 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:41.581946209 +0000 UTC m=+246.664529837 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.084087 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq"
Mar 21 04:55:41 crc kubenswrapper[4580]: W0321 04:55:41.105520 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcade120_6711_4045_9149_08985699febd.slice/crio-95fffe63db875a2bf780f900dccbde29370cf2f478e92b558183ea6a529d0890 WatchSource:0}: Error finding container 95fffe63db875a2bf780f900dccbde29370cf2f478e92b558183ea6a529d0890: Status 404 returned error can't find the container with id 95fffe63db875a2bf780f900dccbde29370cf2f478e92b558183ea6a529d0890
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.105898 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.106860 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.111740 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dnnd6" event={"ID":"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc","Type":"ContainerStarted","Data":"dd84d6fbb314006cfd02a9c1ef855975d09467fceced2fb4cac4dde745c40574"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.111803 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dnnd6" event={"ID":"41dae12a-fc3b-4e2b-a64a-f4f4c791afbc","Type":"ContainerStarted","Data":"94390c0805e2e0b871ebf778d73c444b92e875407a43523e23aeda7598b0b9a2"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.112528 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jnpfl"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.124653 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gltvw"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.151896 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" event={"ID":"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa","Type":"ContainerStarted","Data":"d7e2b0b7c75854afdeb2d40a97b77d3f238c4d80db84e41b1a945a43e62eab17"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.154312 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-8cxbg"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.174317 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9csml"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.181824 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xd8hf"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.182934 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:41 crc kubenswrapper[4580]: E0321 04:55:41.187983 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:41.687965955 +0000 UTC m=+246.770549583 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.196481 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb" event={"ID":"a87434a7-bb73-4552-a502-c1a31119cff7","Type":"ContainerStarted","Data":"0eb14c24fe2d8c3a888b32e29192a27bc85f84de4064fcd4f595ab08fe77ec2b"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.229180 4580 generic.go:334] "Generic (PLEG): container finished" podID="1b720ad7-2de4-43c9-bab3-81c68d5dfde7" containerID="6e0b4633d5f6360fea5624c5a3b42bc69d714565720362259c548c3ba71285a2" exitCode=0
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.229276 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fshpg" event={"ID":"1b720ad7-2de4-43c9-bab3-81c68d5dfde7","Type":"ContainerDied","Data":"6e0b4633d5f6360fea5624c5a3b42bc69d714565720362259c548c3ba71285a2"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.248365 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh" event={"ID":"ca24069e-6f60-4d3b-950e-bf6a87fa1955","Type":"ContainerStarted","Data":"20d0308a207e219d46b7d936ae1f52f2c18c72b33fd2b9114af1fd4665183a49"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.267398 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" event={"ID":"794862b0-a985-459a-98d4-cc82612d3593","Type":"ContainerStarted","Data":"a842032f4090e45a447d6311f734ecc782f43b08273a2e5a829145d74335ad7d"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.267892 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.281127 4580 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xh9jk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.281228 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" podUID="794862b0-a985-459a-98d4-cc82612d3593" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.292256 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:41 crc kubenswrapper[4580]: E0321 04:55:41.293036 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:41.793007676 +0000 UTC m=+246.875591304 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.325017 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" event={"ID":"7b6460aa-400a-4a67-9ac2-93bf4268e610","Type":"ContainerStarted","Data":"c5ebd8c17d02e1a755cd7dbd8eb0326ec0510ea5b34d496310d8e2129945da13"}
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.341765 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp"
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.402983 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:41 crc kubenswrapper[4580]: E0321 04:55:41.404914 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:41.904895757 +0000 UTC m=+246.987479385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.505768 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:41 crc kubenswrapper[4580]: E0321 04:55:41.507645 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:42.00761757 +0000 UTC m=+247.090201198 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.632308 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:41 crc kubenswrapper[4580]: E0321 04:55:41.632749 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:42.132732252 +0000 UTC m=+247.215315880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.710121 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm"]
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.733536 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:41 crc kubenswrapper[4580]: E0321 04:55:41.733988 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:42.233964128 +0000 UTC m=+247.316547756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.839775 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:41 crc kubenswrapper[4580]: E0321 04:55:41.840337 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:42.340313822 +0000 UTC m=+247.422897450 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.879265 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-br8h5"]
Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.942175 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:41 crc kubenswrapper[4580]: E0321 04:55:41.942811 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:42.442767508 +0000 UTC m=+247.525351136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:41 crc kubenswrapper[4580]: I0321 04:55:41.993964 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" podStartSLOduration=186.993931725 podStartE2EDuration="3m6.993931725s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:41.991295139 +0000 UTC m=+247.073878767" watchObservedRunningTime="2026-03-21 04:55:41.993931725 +0000 UTC m=+247.076515363" Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.045272 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.046044 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.047036 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:55:42.547015099 +0000 UTC m=+247.629598727 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.063873 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:42 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:42 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:42 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.063951 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.151305 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.151432 4580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:42.651408954 +0000 UTC m=+247.733992582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.151666 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.152077 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:42.652069871 +0000 UTC m=+247.734653499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.161352 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jhb4w" podStartSLOduration=187.161324791 podStartE2EDuration="3m7.161324791s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:42.158381448 +0000 UTC m=+247.240965076" watchObservedRunningTime="2026-03-21 04:55:42.161324791 +0000 UTC m=+247.243908419" Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.206192 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwprr" podStartSLOduration=187.20617434 podStartE2EDuration="3m7.20617434s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:42.204263373 +0000 UTC m=+247.286847011" watchObservedRunningTime="2026-03-21 04:55:42.20617434 +0000 UTC m=+247.288757968" Mar 21 04:55:42 crc kubenswrapper[4580]: W0321 04:55:42.252727 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod119cdbf4_8721_4241_80c1_a85b8df6ce52.slice/crio-d4c490c88d771113345808d8649ef168a732add8a84c598de768f538d55d4487 
WatchSource:0}: Error finding container d4c490c88d771113345808d8649ef168a732add8a84c598de768f538d55d4487: Status 404 returned error can't find the container with id d4c490c88d771113345808d8649ef168a732add8a84c598de768f538d55d4487 Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.252875 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:42.752853635 +0000 UTC m=+247.835437273 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.252763 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.253978 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.254640 4580 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:42.754539757 +0000 UTC m=+247.837123385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.301605 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t"] Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.358045 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.358360 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:42.858325297 +0000 UTC m=+247.940908925 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.358428 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.359276 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:42.8592685 +0000 UTC m=+247.941852128 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.377173 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rsrjh"] Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.383426 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz" event={"ID":"8a0187cd-b67d-47ee-a791-08939d1b4cc5","Type":"ContainerStarted","Data":"840b441b48b82640bf8e3bb30ac0d6e62a600d45f7f2a58b1f946b5e9782f0f2"} Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.415489 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr" event={"ID":"dd075d6a-7d15-421a-b546-19c5cab789d3","Type":"ContainerStarted","Data":"05e0bbd87b3350ef6d49ef8f028c6afc6ee09f7cd46c1e9f541ab5075ff4cbba"} Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.445224 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4ffj8" event={"ID":"69b1f163-8594-47b1-85c7-3330e0d50d8f","Type":"ContainerStarted","Data":"cff581565b242ba49770a345e250537f914c006e3d27c4821f311cb5ddda8370"} Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.464524 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.465049 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:42.965009999 +0000 UTC m=+248.047593767 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.465209 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.465943 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:42.965932962 +0000 UTC m=+248.048516590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.476685 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-48dqz" event={"ID":"bcade120-6711-4045-9149-08985699febd","Type":"ContainerStarted","Data":"95fffe63db875a2bf780f900dccbde29370cf2f478e92b558183ea6a529d0890"} Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.484681 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm" event={"ID":"48bae9ea-47af-484e-a8c4-b6c3e49438e5","Type":"ContainerStarted","Data":"91af1f23f6a2452e2b47642c9e3e7422db8f0d5632302db3db91db927067483f"} Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.496130 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" event={"ID":"329e99ed-cd13-46e2-af1f-3e2bc9eb692d","Type":"ContainerStarted","Data":"d0eb199c5c0ada23797d0934b6ba56bdadeeb452c57aa0750d3344ae67df2267"} Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.565615 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-qpsf8" podStartSLOduration=187.565580268 podStartE2EDuration="3m7.565580268s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:42.546376269 +0000 UTC m=+247.628959917" 
watchObservedRunningTime="2026-03-21 04:55:42.565580268 +0000 UTC m=+247.648163896" Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.571732 4580 generic.go:334] "Generic (PLEG): container finished" podID="05d9d8f9-d2d0-48ad-9583-5caaf4675cd4" containerID="4cb3efa5d34cec4835768d906d9bf8950d07eac97385a7f37ce10b4adb0de9a6" exitCode=0 Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.571877 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" event={"ID":"05d9d8f9-d2d0-48ad-9583-5caaf4675cd4","Type":"ContainerDied","Data":"4cb3efa5d34cec4835768d906d9bf8950d07eac97385a7f37ce10b4adb0de9a6"} Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.573244 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.574741 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:43.074718126 +0000 UTC m=+248.157301754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.679703 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.681072 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:43.181051459 +0000 UTC m=+248.263635087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.747666 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8"] Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.775659 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" podStartSLOduration=186.775620849 podStartE2EDuration="3m6.775620849s" podCreationTimestamp="2026-03-21 04:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:42.770946942 +0000 UTC m=+247.853530600" watchObservedRunningTime="2026-03-21 04:55:42.775620849 +0000 UTC m=+247.858204477" Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.784609 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.785086 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:55:43.285064994 +0000 UTC m=+248.367648622 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.816143 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p8tbn" event={"ID":"dcd68a9c-6c86-41a1-9d04-309a7db16685","Type":"ContainerStarted","Data":"641876c48e369ad647c766141189e81b9cfbf9b88a04385aea08e9d0c9bbec93"} Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.841421 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-dnnd6" podStartSLOduration=187.84139958 podStartE2EDuration="3m7.84139958s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:42.839454332 +0000 UTC m=+247.922037960" watchObservedRunningTime="2026-03-21 04:55:42.84139958 +0000 UTC m=+247.923983208" Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.891317 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.891752 4580 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:43.391736516 +0000 UTC m=+248.474320144 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.941462 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-cvqg4" podStartSLOduration=187.941434946 podStartE2EDuration="3m7.941434946s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:42.938125484 +0000 UTC m=+248.020709132" watchObservedRunningTime="2026-03-21 04:55:42.941434946 +0000 UTC m=+248.024018564" Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.941892 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb"] Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.990650 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rrbz2"] Mar 21 04:55:42 crc kubenswrapper[4580]: I0321 04:55:42.992268 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:42 crc kubenswrapper[4580]: E0321 04:55:42.992722 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:43.492695555 +0000 UTC m=+248.575279183 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.003742 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv" event={"ID":"3a6d75f9-0ef3-4c99-8d68-7809b17fc607","Type":"ContainerStarted","Data":"1331d8bed1cc21b2fa3966b40307495a27448764059e9a673510f417345dce73"} Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.046363 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" event={"ID":"7b6460aa-400a-4a67-9ac2-93bf4268e610","Type":"ContainerStarted","Data":"f8403d79154c6f0843fa4a241605f65852559e6adc5d37ba7bb5b35434a9e300"} Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.057991 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-br8h5" 
event={"ID":"119cdbf4-8721-4241-80c1-a85b8df6ce52","Type":"ContainerStarted","Data":"d4c490c88d771113345808d8649ef168a732add8a84c598de768f538d55d4487"} Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.099181 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:43 crc kubenswrapper[4580]: E0321 04:55:43.099933 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:43.599919171 +0000 UTC m=+248.682502789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.101219 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb" event={"ID":"a87434a7-bb73-4552-a502-c1a31119cff7","Type":"ContainerStarted","Data":"aee9144dab57a104a1947b97911bfc806db7110bad129bf5160d6224d00c33e7"} Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.145932 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 
04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.164858 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-qpsf8" Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.213363 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:43 crc kubenswrapper[4580]: E0321 04:55:43.215367 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:43.71533943 +0000 UTC m=+248.797923068 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:43 crc kubenswrapper[4580]: W0321 04:55:43.234572 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod239df9bb_a89d_49c9_b889_8cd32c1db001.slice/crio-d0dbacb1e41385e5343750670b9b0ba048a0be5ff0c1354b313dbba0e3991b45 WatchSource:0}: Error finding container d0dbacb1e41385e5343750670b9b0ba048a0be5ff0c1354b313dbba0e3991b45: Status 404 returned error can't find the container with id d0dbacb1e41385e5343750670b9b0ba048a0be5ff0c1354b313dbba0e3991b45 Mar 21 04:55:43 crc 
kubenswrapper[4580]: I0321 04:55:43.285121 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:43 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:43 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:43 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.285211 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.286645 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hcpgb" podStartSLOduration=188.286620199 podStartE2EDuration="3m8.286620199s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:43.285596293 +0000 UTC m=+248.368179941" watchObservedRunningTime="2026-03-21 04:55:43.286620199 +0000 UTC m=+248.369203827" Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.327088 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:43 crc kubenswrapper[4580]: E0321 04:55:43.327482 4580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:43.827470688 +0000 UTC m=+248.910054316 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.391723 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-vhp42" podStartSLOduration=188.391692051 podStartE2EDuration="3m8.391692051s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:43.385227069 +0000 UTC m=+248.467810717" watchObservedRunningTime="2026-03-21 04:55:43.391692051 +0000 UTC m=+248.474275679" Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.428918 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:43 crc kubenswrapper[4580]: E0321 04:55:43.429915 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:55:43.929883334 +0000 UTC m=+249.012466972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.536893 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:43 crc kubenswrapper[4580]: E0321 04:55:43.537772 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:44.037747955 +0000 UTC m=+249.120331583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.642261 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:43 crc kubenswrapper[4580]: E0321 04:55:43.642920 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:44.142895529 +0000 UTC m=+249.225479157 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.744107 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:43 crc kubenswrapper[4580]: E0321 04:55:43.744507 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:44.244490403 +0000 UTC m=+249.327074031 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.781154 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xh9jk"] Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.861096 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt"] Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.862718 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:43 crc kubenswrapper[4580]: E0321 04:55:43.863668 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:44.363638386 +0000 UTC m=+249.446222014 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:43 crc kubenswrapper[4580]: I0321 04:55:43.965911 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:43 crc kubenswrapper[4580]: E0321 04:55:43.966734 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:44.466716928 +0000 UTC m=+249.549300556 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.025756 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp"] Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.074122 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:44 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:44 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:44 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.074193 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.074449 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:44 crc kubenswrapper[4580]: E0321 04:55:44.075021 4580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:44.575002539 +0000 UTC m=+249.657586167 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:44 crc kubenswrapper[4580]: W0321 04:55:44.120825 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb67fb1a5_405c_4419_baa5_e144d82fb317.slice/crio-a602e54e9be874fea1ffe2dd5ca529888cdabd154f645ddbd91a70b4bf8adc17 WatchSource:0}: Error finding container a602e54e9be874fea1ffe2dd5ca529888cdabd154f645ddbd91a70b4bf8adc17: Status 404 returned error can't find the container with id a602e54e9be874fea1ffe2dd5ca529888cdabd154f645ddbd91a70b4bf8adc17 Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.147573 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v"] Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.154088 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t" event={"ID":"26340f0a-c057-4d3a-aac0-45e31a795929","Type":"ContainerStarted","Data":"d8035c8f837989a01e498a800fcf8f2443a5f613bbbfe23a8480e82b28f2735a"} Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.167528 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" event={"ID":"973f371e-32fd-4c04-a76c-2a8e8cc54f5d","Type":"ContainerStarted","Data":"ceb840d69f057c21aaacddf14785d62f8236f2cf577c2d4873bf32a3d82bf069"} Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.176030 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:44 crc kubenswrapper[4580]: E0321 04:55:44.176463 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:44.67644843 +0000 UTC m=+249.759032068 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.188032 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" event={"ID":"e7a9eebb-7ea8-4e16-9d70-f35fb12df177","Type":"ContainerStarted","Data":"17a9c2bcce679e09ff4b975f83fa0df7ad159f47bd3cb44991f9158d009ccf82"} Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.261603 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" event={"ID":"8ec0221a-e2f4-4bc9-b6d3-5faf99da51aa","Type":"ContainerStarted","Data":"c736774e888c109486bb3dfbae447ff4a08c735510aef8e04b34585f5f967d57"} Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.278651 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:44 crc kubenswrapper[4580]: E0321 04:55:44.279086 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:44.779061461 +0000 UTC m=+249.861645089 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.324663 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz" event={"ID":"8a0187cd-b67d-47ee-a791-08939d1b4cc5","Type":"ContainerStarted","Data":"1566cbf2900c36acb4406a3120c247475e2c36168025277fb90448b6fc7a755c"} Mar 21 04:55:44 crc kubenswrapper[4580]: W0321 04:55:44.333270 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e86baa4_7923_4d5e_bb0f_67085a562d68.slice/crio-6fa6093dc4bdddf17eab018b5c68835e4461e5b98d229bc073c74c0ac3c02d5b WatchSource:0}: Error finding container 6fa6093dc4bdddf17eab018b5c68835e4461e5b98d229bc073c74c0ac3c02d5b: Status 404 returned error can't find the container with id 6fa6093dc4bdddf17eab018b5c68835e4461e5b98d229bc073c74c0ac3c02d5b Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.369307 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xd8hf" event={"ID":"84d4caa0-b9cd-40b1-ab0f-0903523cfbce","Type":"ContainerStarted","Data":"fcd16aa98f875250b5eafd9c178ded3ca69b4c0505bb578bcc9e976f8f94266c"} Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.372294 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq"] Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.379987 4580 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:44 crc kubenswrapper[4580]: E0321 04:55:44.381675 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:44.881662031 +0000 UTC m=+249.964245659 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.410330 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-48dqz" event={"ID":"bcade120-6711-4045-9149-08985699febd","Type":"ContainerStarted","Data":"3bbe4a9cc386beb963f743da2e4139c3d7607ebb625d17066b6601f480d8ee82"} Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.440292 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rsrjh" event={"ID":"bf3a3e39-aab5-499e-a39a-0b74c9d7d666","Type":"ContainerStarted","Data":"d4c3a87cb8649e8322dd73ae964ffdb6350358364348b4caa7908ab2de6162b0"} Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.465252 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv" 
event={"ID":"3a6d75f9-0ef3-4c99-8d68-7809b17fc607","Type":"ContainerStarted","Data":"43933b0c4895d00d1e630793ef932a973e8c921d25db1794a25394810504a5b1"} Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.481438 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:44 crc kubenswrapper[4580]: E0321 04:55:44.482717 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:44.982700482 +0000 UTC m=+250.065284110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.497567 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2rbc4" podStartSLOduration=189.497544352 podStartE2EDuration="3m9.497544352s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:44.394252935 +0000 UTC m=+249.476836573" watchObservedRunningTime="2026-03-21 04:55:44.497544352 +0000 UTC m=+249.580127980" Mar 21 04:55:44 
crc kubenswrapper[4580]: I0321 04:55:44.529374 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8" event={"ID":"239df9bb-a89d-49c9-b889-8cd32c1db001","Type":"ContainerStarted","Data":"d0dbacb1e41385e5343750670b9b0ba048a0be5ff0c1354b313dbba0e3991b45"} Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.542427 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4ffj8" event={"ID":"69b1f163-8594-47b1-85c7-3330e0d50d8f","Type":"ContainerStarted","Data":"60bf223bf0fffddead1d8dc64a91349c534f1f50c12865c63206c9a247adefbc"} Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.542473 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-4ffj8" Mar 21 04:55:44 crc kubenswrapper[4580]: W0321 04:55:44.563680 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92b13d13_f88e_47cc_8815_34b54fd68711.slice/crio-e7df0b6c67b0beee27bb469cede347f1cf133acaba77c61b28df3537000213a9 WatchSource:0}: Error finding container e7df0b6c67b0beee27bb469cede347f1cf133acaba77c61b28df3537000213a9: Status 404 returned error can't find the container with id e7df0b6c67b0beee27bb469cede347f1cf133acaba77c61b28df3537000213a9 Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.570412 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.570467 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" 
output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.582911 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:44 crc kubenswrapper[4580]: E0321 04:55:44.583372 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:45.083356333 +0000 UTC m=+250.165939961 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.618823 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-48dqz" podStartSLOduration=189.618803238 podStartE2EDuration="3m9.618803238s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:44.606847429 +0000 UTC m=+249.689431057" watchObservedRunningTime="2026-03-21 04:55:44.618803238 +0000 UTC m=+249.701386866" Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 
04:55:44.623304 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cl7vz" podStartSLOduration=189.62328786 podStartE2EDuration="3m9.62328786s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:44.498011774 +0000 UTC m=+249.580595402" watchObservedRunningTime="2026-03-21 04:55:44.62328786 +0000 UTC m=+249.705871488" Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.636482 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gltvw"] Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.676320 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-4ffj8" podStartSLOduration=189.676299272 podStartE2EDuration="3m9.676299272s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:44.675708678 +0000 UTC m=+249.758292306" watchObservedRunningTime="2026-03-21 04:55:44.676299272 +0000 UTC m=+249.758882900" Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.686274 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:44 crc kubenswrapper[4580]: E0321 04:55:44.690563 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:55:45.190538568 +0000 UTC m=+250.273122266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.691603 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:44 crc kubenswrapper[4580]: E0321 04:55:44.692032 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:45.192016834 +0000 UTC m=+250.274600462 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.794645 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jnpfl"] Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.812365 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:44 crc kubenswrapper[4580]: E0321 04:55:44.812904 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:45.31288429 +0000 UTC m=+250.395467908 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.813684 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp"] Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.829123 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9"] Mar 21 04:55:44 crc kubenswrapper[4580]: I0321 04:55:44.916016 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:44 crc kubenswrapper[4580]: E0321 04:55:44.916383 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:45.416365042 +0000 UTC m=+250.498948670 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:44 crc kubenswrapper[4580]: W0321 04:55:44.945419 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd4caa55_55f9_42e7_94d2_0069efa2a4ae.slice/crio-0851824abd15f6aeee27278d67f9d88d6e2663c5546bc68ff79ce90fbc458482 WatchSource:0}: Error finding container 0851824abd15f6aeee27278d67f9d88d6e2663c5546bc68ff79ce90fbc458482: Status 404 returned error can't find the container with id 0851824abd15f6aeee27278d67f9d88d6e2663c5546bc68ff79ce90fbc458482 Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.002627 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-8cxbg"] Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.013011 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9csml"] Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.017327 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:45 crc kubenswrapper[4580]: E0321 04:55:45.017715 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-21 04:55:45.517697241 +0000 UTC m=+250.600280869 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.053706 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:45 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:45 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:45 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.054070 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.118831 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:45 crc kubenswrapper[4580]: E0321 04:55:45.126030 4580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:45.626014033 +0000 UTC m=+250.708597651 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.225837 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:45 crc kubenswrapper[4580]: E0321 04:55:45.226320 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:45.726300736 +0000 UTC m=+250.808884364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.249460 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr"] Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.274495 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm"] Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.312900 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl"] Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.329352 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:45 crc kubenswrapper[4580]: E0321 04:55:45.329821 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:45.829805978 +0000 UTC m=+250.912389606 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.356771 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hnl8s"] Mar 21 04:55:45 crc kubenswrapper[4580]: W0321 04:55:45.358770 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1714688f_61d5_436b_baaf_2668757942fd.slice/crio-2e83675d6593419ae3dcd0918b0bacf5852f8886683253cf706a71bf0f7992c8 WatchSource:0}: Error finding container 2e83675d6593419ae3dcd0918b0bacf5852f8886683253cf706a71bf0f7992c8: Status 404 returned error can't find the container with id 2e83675d6593419ae3dcd0918b0bacf5852f8886683253cf706a71bf0f7992c8 Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.363310 4580 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:55:45 crc kubenswrapper[4580]: W0321 04:55:45.406061 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10f921f7_e891_49fd_825b_37843ebc2f29.slice/crio-f87bd9ea61db95a3841e59db58e2db9b58f71da3c4593f04d3ce4da91380b60d WatchSource:0}: Error finding container f87bd9ea61db95a3841e59db58e2db9b58f71da3c4593f04d3ce4da91380b60d: Status 404 returned error can't find the container with id f87bd9ea61db95a3841e59db58e2db9b58f71da3c4593f04d3ce4da91380b60d Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.430503 4580 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:45 crc kubenswrapper[4580]: E0321 04:55:45.430916 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:45.930897661 +0000 UTC m=+251.013481279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.534146 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:45 crc kubenswrapper[4580]: E0321 04:55:45.534469 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:46.034456995 +0000 UTC m=+251.117040623 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:45 crc kubenswrapper[4580]: W0321 04:55:45.562931 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5185e797_2aa2_4012_96e0_0afb8c92a09e.slice/crio-61b9b9643735e76bdf9d6c144f0fac6d30b7ee2d63ebddc89ef3fdf2f9ac8bf2 WatchSource:0}: Error finding container 61b9b9643735e76bdf9d6c144f0fac6d30b7ee2d63ebddc89ef3fdf2f9ac8bf2: Status 404 returned error can't find the container with id 61b9b9643735e76bdf9d6c144f0fac6d30b7ee2d63ebddc89ef3fdf2f9ac8bf2 Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.563920 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567814-8cxbg" event={"ID":"1714688f-61d5-436b-baaf-2668757942fd","Type":"ContainerStarted","Data":"2e83675d6593419ae3dcd0918b0bacf5852f8886683253cf706a71bf0f7992c8"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.570176 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" event={"ID":"973f371e-32fd-4c04-a76c-2a8e8cc54f5d","Type":"ContainerStarted","Data":"dfff0cf926987724ce81ffcfbe1b8d22ae2ba8c903a97e83a2d4b370547bcd4e"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.571607 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.573874 4580 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-46znb container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.573921 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" podUID="973f371e-32fd-4c04-a76c-2a8e8cc54f5d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.587530 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jnpfl" event={"ID":"96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6","Type":"ContainerStarted","Data":"702bfe7bed8d0c176e227d0fd96a5fc39065364e17132a05c1336f2971391ce4"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.608414 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fshpg" event={"ID":"1b720ad7-2de4-43c9-bab3-81c68d5dfde7","Type":"ContainerStarted","Data":"75e97717520bdde586fcfd3a5b29b1263e8a549216d98f353415758b80a933f7"} Mar 21 04:55:45 crc kubenswrapper[4580]: W0321 04:55:45.611953 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d1b089c_8016_458b_83b5_84f602ea0ba7.slice/crio-3d20e3ae0c79ed7c02d855a4aab846999212f3095fa15da782d453cdf873e181 WatchSource:0}: Error finding container 3d20e3ae0c79ed7c02d855a4aab846999212f3095fa15da782d453cdf873e181: Status 404 returned error can't find the container with id 3d20e3ae0c79ed7c02d855a4aab846999212f3095fa15da782d453cdf873e181 Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.626819 4580 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" podStartSLOduration=190.626764358 podStartE2EDuration="3m10.626764358s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:45.625188589 +0000 UTC m=+250.707772217" watchObservedRunningTime="2026-03-21 04:55:45.626764358 +0000 UTC m=+250.709347986" Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.634861 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:45 crc kubenswrapper[4580]: E0321 04:55:45.635346 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:46.135314781 +0000 UTC m=+251.217898409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.635598 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:45 crc kubenswrapper[4580]: E0321 04:55:45.636316 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:46.136289266 +0000 UTC m=+251.218872894 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.662951 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9" event={"ID":"6aa8e3c1-9f91-43c2-9eac-6c146199916e","Type":"ContainerStarted","Data":"7ee5306165ecc46bcbc4cd3ba5c9232b74f0215dac3243a3de47eef6987413f6"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.663032 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v" event={"ID":"1e86baa4-7923-4d5e-bb0f-67085a562d68","Type":"ContainerStarted","Data":"6fa6093dc4bdddf17eab018b5c68835e4461e5b98d229bc073c74c0ac3c02d5b"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.663045 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gltvw" event={"ID":"377f7953-d6ed-4d67-a92c-07da9f5075d3","Type":"ContainerStarted","Data":"eef483ac0a4865f0533406dc681a98a42eb415d4bcf58c527d302e4b9df3f265"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.663067 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm" event={"ID":"48bae9ea-47af-484e-a8c4-b6c3e49438e5","Type":"ContainerStarted","Data":"68e0ec424a6b14e1bea960fd6a7c08890262fdcc64295a53fadfd419fbae8c78"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.663586 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t" event={"ID":"26340f0a-c057-4d3a-aac0-45e31a795929","Type":"ContainerStarted","Data":"53cc1819410309008391f8827d8f856e9fd808e6e545d5ca073411c9d4678561"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.670923 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xd8hf" event={"ID":"84d4caa0-b9cd-40b1-ab0f-0903523cfbce","Type":"ContainerStarted","Data":"c956976020de28bb7d37b2cca3430401cb36711d009528b6adff2db1ba30aac4"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.683096 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-br8h5" event={"ID":"119cdbf4-8721-4241-80c1-a85b8df6ce52","Type":"ContainerStarted","Data":"8a765af834dc5599882e95ddce83658e0106bd3e926566c49fc62cd2146c76e7"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.716093 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p8tbn" event={"ID":"dcd68a9c-6c86-41a1-9d04-309a7db16685","Type":"ContainerStarted","Data":"f4eae64714b3bdf8291f6ef2ddac404e23208f9531227125c1a29d74a1b0709e"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.724198 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" event={"ID":"edb0a4af-3bf8-4517-af4a-a7546e9acf87","Type":"ContainerStarted","Data":"62c583aade93c117c8776d39dc0e87a414f69a948f966f858437c1c5a80477bf"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.734077 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh" event={"ID":"ca24069e-6f60-4d3b-950e-bf6a87fa1955","Type":"ContainerStarted","Data":"77cf65a12c73c408360da6f6f16beffd69def9dac1a3660c85eef82b82a48046"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.740667 4580 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:45 crc kubenswrapper[4580]: E0321 04:55:45.740879 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:46.240854225 +0000 UTC m=+251.323437853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.741157 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:45 crc kubenswrapper[4580]: E0321 04:55:45.743392 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:46.243364037 +0000 UTC m=+251.325947665 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.755355 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp" event={"ID":"cd4caa55-55f9-42e7-94d2-0069efa2a4ae","Type":"ContainerStarted","Data":"0851824abd15f6aeee27278d67f9d88d6e2663c5546bc68ff79ce90fbc458482"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.762164 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr" event={"ID":"dd075d6a-7d15-421a-b546-19c5cab789d3","Type":"ContainerStarted","Data":"d578e4e514f73d9a9f0fe5272a180fe7ca1b808fcec1ea477215515005b572a1"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.771915 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" event={"ID":"d9cce850-4a50-4a52-ac9b-147fcbde086a","Type":"ContainerStarted","Data":"178b2d554fa238dc1378fc8d50bfc7ef5f5e7e1a7376b0de940657092f05615e"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.789529 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" event={"ID":"92b13d13-f88e-47cc-8815-34b54fd68711","Type":"ContainerStarted","Data":"e7df0b6c67b0beee27bb469cede347f1cf133acaba77c61b28df3537000213a9"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.792231 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt" event={"ID":"b67fb1a5-405c-4419-baa5-e144d82fb317","Type":"ContainerStarted","Data":"a602e54e9be874fea1ffe2dd5ca529888cdabd154f645ddbd91a70b4bf8adc17"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.799476 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" event={"ID":"05d9d8f9-d2d0-48ad-9583-5caaf4675cd4","Type":"ContainerStarted","Data":"14b3a405af59949a7d9932db1b25a75d7c00cc6803a2eeb12164f6d34986f417"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.803948 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.814083 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" event={"ID":"329e99ed-cd13-46e2-af1f-3e2bc9eb692d","Type":"ContainerStarted","Data":"48bb1e8f8ad1bd53e4d3b9e166e5caf3d041f9043411e1bb80f8e1802fd5c3ab"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.815571 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.830173 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9csml" event={"ID":"10f921f7-e891-49fd-825b-37843ebc2f29","Type":"ContainerStarted","Data":"f87bd9ea61db95a3841e59db58e2db9b58f71da3c4593f04d3ce4da91380b60d"} Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.830509 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" podUID="794862b0-a985-459a-98d4-cc82612d3593" containerName="controller-manager" containerID="cri-o://a842032f4090e45a447d6311f734ecc782f43b08273a2e5a829145d74335ad7d" gracePeriod=30 
Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.832415 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" podUID="8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc" containerName="route-controller-manager" containerID="cri-o://49544dea4693e0ee539af213e7cbfcc9b1333d129a0e81353ce02c7b2902b6c9" gracePeriod=30 Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.834128 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.834165 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.834240 4580 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-kmfk5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.834254 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" podUID="329e99ed-cd13-46e2-af1f-3e2bc9eb692d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.842527 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:45 crc kubenswrapper[4580]: E0321 04:55:45.844316 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:46.344290235 +0000 UTC m=+251.426873863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:45 crc kubenswrapper[4580]: I0321 04:55:45.979014 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:45 crc kubenswrapper[4580]: E0321 04:55:45.984734 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:46.484711759 +0000 UTC m=+251.567295387 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.053753 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:46 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:46 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:46 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.053853 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.080051 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:46 crc kubenswrapper[4580]: E0321 04:55:46.080518 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:55:46.580499679 +0000 UTC m=+251.663083307 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.117820 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.181527 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:46 crc kubenswrapper[4580]: E0321 04:55:46.181919 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:46.681899979 +0000 UTC m=+251.764483607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.283349 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:46 crc kubenswrapper[4580]: E0321 04:55:46.283598 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:46.783559576 +0000 UTC m=+251.866143214 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.283884 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:46 crc kubenswrapper[4580]: E0321 04:55:46.284678 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:46.784656383 +0000 UTC m=+251.867240011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.384616 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:46 crc kubenswrapper[4580]: E0321 04:55:46.385161 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:46.88514029 +0000 UTC m=+251.967723918 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.385685 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" podStartSLOduration=191.385671034 podStartE2EDuration="3m11.385671034s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:46.311297498 +0000 UTC m=+251.393881136" watchObservedRunningTime="2026-03-21 04:55:46.385671034 +0000 UTC m=+251.468254672" Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.386637 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xgchr" podStartSLOduration=191.386631698 podStartE2EDuration="3m11.386631698s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:46.377799607 +0000 UTC m=+251.460383245" watchObservedRunningTime="2026-03-21 04:55:46.386631698 +0000 UTC m=+251.469215326" Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.418383 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-p8tbn" podStartSLOduration=191.418360769 podStartE2EDuration="3m11.418360769s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:46.415129659 +0000 UTC m=+251.497713327" watchObservedRunningTime="2026-03-21 04:55:46.418360769 +0000 UTC m=+251.500944397" Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.449587 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" podStartSLOduration=191.449567268 podStartE2EDuration="3m11.449567268s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:46.447514717 +0000 UTC m=+251.530098355" watchObservedRunningTime="2026-03-21 04:55:46.449567268 +0000 UTC m=+251.532150896" Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.465255 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jjvsm" podStartSLOduration=191.465232959 podStartE2EDuration="3m11.465232959s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:46.463533046 +0000 UTC m=+251.546116684" watchObservedRunningTime="2026-03-21 04:55:46.465232959 +0000 UTC m=+251.547816587" Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.486193 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:46 crc kubenswrapper[4580]: E0321 04:55:46.486546 4580 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:46.98653311 +0000 UTC m=+252.069116738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.496003 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" podStartSLOduration=190.495980106 podStartE2EDuration="3m10.495980106s" podCreationTimestamp="2026-03-21 04:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:46.494892099 +0000 UTC m=+251.577475747" watchObservedRunningTime="2026-03-21 04:55:46.495980106 +0000 UTC m=+251.578563734" Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.547294 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-xd8hf" podStartSLOduration=9.547266136 podStartE2EDuration="9.547266136s" podCreationTimestamp="2026-03-21 04:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:46.512876478 +0000 UTC m=+251.595460106" watchObservedRunningTime="2026-03-21 04:55:46.547266136 +0000 UTC m=+251.629849764" Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.568561 4580 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m4kxh" podStartSLOduration=191.568529486 podStartE2EDuration="3m11.568529486s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:46.564913196 +0000 UTC m=+251.647496824" watchObservedRunningTime="2026-03-21 04:55:46.568529486 +0000 UTC m=+251.651113134" Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.587825 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:46 crc kubenswrapper[4580]: E0321 04:55:46.588063 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:47.088020803 +0000 UTC m=+252.170604441 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.588416 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:46 crc kubenswrapper[4580]: E0321 04:55:46.588873 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:47.088852453 +0000 UTC m=+252.171436081 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.689921 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:46 crc kubenswrapper[4580]: E0321 04:55:46.690415 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:47.190396537 +0000 UTC m=+252.272980165 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.793737 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:46 crc kubenswrapper[4580]: E0321 04:55:46.794372 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:47.294198607 +0000 UTC m=+252.376782235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:46 crc kubenswrapper[4580]: I0321 04:55:46.895504 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:46 crc kubenswrapper[4580]: E0321 04:55:46.896434 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:47.396414127 +0000 UTC m=+252.478997755 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.003091 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.004241 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:47.504222467 +0000 UTC m=+252.586806095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.060514 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:47 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:47 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:47 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.061038 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.061575 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-br8h5" event={"ID":"119cdbf4-8721-4241-80c1-a85b8df6ce52","Type":"ContainerStarted","Data":"e1617b59317121f3db1d02c9a4c3e721120bc71c309a51068fe95ae64868e1a8"} Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.101745 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" event={"ID":"92b13d13-f88e-47cc-8815-34b54fd68711","Type":"ContainerStarted","Data":"a6be6f616e65ef14a7724f8ea93ae87aaa63f9e0f48e3f37e0b4c8261348e254"} Mar 21 04:55:47 crc 
kubenswrapper[4580]: I0321 04:55:47.104945 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.106871 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:47.606847248 +0000 UTC m=+252.689430876 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.112456 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-br8h5" podStartSLOduration=192.112433127 podStartE2EDuration="3m12.112433127s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:47.097683799 +0000 UTC m=+252.180267447" watchObservedRunningTime="2026-03-21 04:55:47.112433127 +0000 UTC m=+252.195016755" Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.113337 4580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:47.61331995 +0000 UTC m=+252.695903578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.112571 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.156702 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" podStartSLOduration=192.156684342 podStartE2EDuration="3m12.156684342s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:47.144855226 +0000 UTC m=+252.227438854" watchObservedRunningTime="2026-03-21 04:55:47.156684342 +0000 UTC m=+252.239267970" Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.171931 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv" 
event={"ID":"3a6d75f9-0ef3-4c99-8d68-7809b17fc607","Type":"ContainerStarted","Data":"b1cab6479c730c54814476526a6870750770fe51495ec6494ad6ea0d0ead8716"} Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.190453 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" event={"ID":"5185e797-2aa2-4012-96e0-0afb8c92a09e","Type":"ContainerStarted","Data":"61b9b9643735e76bdf9d6c144f0fac6d30b7ee2d63ebddc89ef3fdf2f9ac8bf2"} Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.214925 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.215528 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:47.715493479 +0000 UTC m=+252.798077107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.226116 4580 generic.go:334] "Generic (PLEG): container finished" podID="8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc" containerID="49544dea4693e0ee539af213e7cbfcc9b1333d129a0e81353ce02c7b2902b6c9" exitCode=0
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.226196 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" event={"ID":"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc","Type":"ContainerDied","Data":"49544dea4693e0ee539af213e7cbfcc9b1333d129a0e81353ce02c7b2902b6c9"}
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.249034 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vckmv" podStartSLOduration=192.249008305 podStartE2EDuration="3m12.249008305s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:47.246913323 +0000 UTC m=+252.329496941" watchObservedRunningTime="2026-03-21 04:55:47.249008305 +0000 UTC m=+252.331591933"
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.249577 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rsrjh" podStartSLOduration=10.249569809 podStartE2EDuration="10.249569809s" podCreationTimestamp="2026-03-21 04:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:47.178470215 +0000 UTC m=+252.261053843" watchObservedRunningTime="2026-03-21 04:55:47.249569809 +0000 UTC m=+252.332153447"
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.267656 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt" event={"ID":"b67fb1a5-405c-4419-baa5-e144d82fb317","Type":"ContainerStarted","Data":"586efb4a6542b2f417e750aa013ba3c1474c820b4032c4dd1206dea45721045b"}
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.293838 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8" podStartSLOduration=192.293765032 podStartE2EDuration="3m12.293765032s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:47.290524821 +0000 UTC m=+252.373108479" watchObservedRunningTime="2026-03-21 04:55:47.293765032 +0000 UTC m=+252.376348670"
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.319197 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.322090 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:47.822066128 +0000 UTC m=+252.904649756 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.400523 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" event={"ID":"8d1b089c-8016-458b-83b5-84f602ea0ba7","Type":"ContainerStarted","Data":"3d20e3ae0c79ed7c02d855a4aab846999212f3095fa15da782d453cdf873e181"}
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.431379 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.431983 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:47.9319654 +0000 UTC m=+253.014549028 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.439760 4580 generic.go:334] "Generic (PLEG): container finished" podID="794862b0-a985-459a-98d4-cc82612d3593" containerID="a842032f4090e45a447d6311f734ecc782f43b08273a2e5a829145d74335ad7d" exitCode=0
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.439867 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" event={"ID":"794862b0-a985-459a-98d4-cc82612d3593","Type":"ContainerDied","Data":"a842032f4090e45a447d6311f734ecc782f43b08273a2e5a829145d74335ad7d"}
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.470136 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl" event={"ID":"bc866c38-7a32-4dab-a741-204309afddc5","Type":"ContainerStarted","Data":"f6ddae0830ecced2760916ae72746e8efdb3fb89b4209c5897f555bdb8855767"}
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.514306 4580 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-46znb container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body=
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.514402 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" podUID="973f371e-32fd-4c04-a76c-2a8e8cc54f5d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused"
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.514682 4580 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-c8fh9 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.514724 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" podUID="05d9d8f9-d2d0-48ad-9583-5caaf4675cd4" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.535956 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.536322 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.036307874 +0000 UTC m=+253.118891502 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.638685 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.639044 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.139001425 +0000 UTC m=+253.221585053 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.647411 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.653607 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.153586559 +0000 UTC m=+253.236170377 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.756420 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.756769 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.256734113 +0000 UTC m=+253.339317741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.756884 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.757379 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.257350558 +0000 UTC m=+253.339934186 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.858117 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.858686 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.358644425 +0000 UTC m=+253.441228053 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.864972 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.865890 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.365866986 +0000 UTC m=+253.448450614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:47 crc kubenswrapper[4580]: I0321 04:55:47.965816 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:47 crc kubenswrapper[4580]: E0321 04:55:47.966187 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.466150748 +0000 UTC m=+253.548734376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.050367 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:55:48 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld
Mar 21 04:55:48 crc kubenswrapper[4580]: [+]process-running ok
Mar 21 04:55:48 crc kubenswrapper[4580]: healthz check failed
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.050748 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.069754 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:48 crc kubenswrapper[4580]: E0321 04:55:48.070205 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.570188834 +0000 UTC m=+253.652772462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.179976 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:48 crc kubenswrapper[4580]: E0321 04:55:48.180660 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.680619909 +0000 UTC m=+253.763203537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.215303 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.263572 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-gltvw" podStartSLOduration=192.261448586 podStartE2EDuration="3m12.261448586s" podCreationTimestamp="2026-03-21 04:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:47.567735298 +0000 UTC m=+252.650318936" watchObservedRunningTime="2026-03-21 04:55:48.261448586 +0000 UTC m=+253.344032214"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.270230 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"]
Mar 21 04:55:48 crc kubenswrapper[4580]: E0321 04:55:48.270567 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794862b0-a985-459a-98d4-cc82612d3593" containerName="controller-manager"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.270585 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="794862b0-a985-459a-98d4-cc82612d3593" containerName="controller-manager"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.270689 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="794862b0-a985-459a-98d4-cc82612d3593" containerName="controller-manager"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.271272 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.279482 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.281884 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-client-ca\") pod \"794862b0-a985-459a-98d4-cc82612d3593\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") "
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.281923 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-config\") pod \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") "
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.281953 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-proxy-ca-bundles\") pod \"794862b0-a985-459a-98d4-cc82612d3593\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") "
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.281984 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-serving-cert\") pod \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") "
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.282010 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqdxh\" (UniqueName: \"kubernetes.io/projected/794862b0-a985-459a-98d4-cc82612d3593-kube-api-access-kqdxh\") pod \"794862b0-a985-459a-98d4-cc82612d3593\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") "
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.282030 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/794862b0-a985-459a-98d4-cc82612d3593-serving-cert\") pod \"794862b0-a985-459a-98d4-cc82612d3593\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") "
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.282059 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-client-ca\") pod \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") "
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.282082 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-config\") pod \"794862b0-a985-459a-98d4-cc82612d3593\" (UID: \"794862b0-a985-459a-98d4-cc82612d3593\") "
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.282122 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6c99\" (UniqueName: \"kubernetes.io/projected/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-kube-api-access-p6c99\") pod \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\" (UID: \"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc\") "
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.282233 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-config\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.282257 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-client-ca\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.282304 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-proxy-ca-bundles\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.282331 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f808bccc-3450-447c-8b0a-7909fe189edd-serving-cert\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.282377 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.282402 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82zm2\" (UniqueName: \"kubernetes.io/projected/f808bccc-3450-447c-8b0a-7909fe189edd-kube-api-access-82zm2\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.283757 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-config" (OuterVolumeSpecName: "config") pod "794862b0-a985-459a-98d4-cc82612d3593" (UID: "794862b0-a985-459a-98d4-cc82612d3593"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.284586 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-config" (OuterVolumeSpecName: "config") pod "8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc" (UID: "8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.287015 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-client-ca" (OuterVolumeSpecName: "client-ca") pod "794862b0-a985-459a-98d4-cc82612d3593" (UID: "794862b0-a985-459a-98d4-cc82612d3593"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:55:48 crc kubenswrapper[4580]: E0321 04:55:48.287347 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.787333662 +0000 UTC m=+253.869917290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.290017 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "794862b0-a985-459a-98d4-cc82612d3593" (UID: "794862b0-a985-459a-98d4-cc82612d3593"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.297158 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-client-ca" (OuterVolumeSpecName: "client-ca") pod "8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc" (UID: "8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.313764 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-kube-api-access-p6c99" (OuterVolumeSpecName: "kube-api-access-p6c99") pod "8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc" (UID: "8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc"). InnerVolumeSpecName "kube-api-access-p6c99". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.325342 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794862b0-a985-459a-98d4-cc82612d3593-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "794862b0-a985-459a-98d4-cc82612d3593" (UID: "794862b0-a985-459a-98d4-cc82612d3593"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.327652 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc" (UID: "8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.329124 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"]
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.329724 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794862b0-a985-459a-98d4-cc82612d3593-kube-api-access-kqdxh" (OuterVolumeSpecName: "kube-api-access-kqdxh") pod "794862b0-a985-459a-98d4-cc82612d3593" (UID: "794862b0-a985-459a-98d4-cc82612d3593"). InnerVolumeSpecName "kube-api-access-kqdxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.390636 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391277 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-config\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391385 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-client-ca\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"
Mar 21 04:55:48 crc kubenswrapper[4580]: E0321 04:55:48.391465 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.891432599 +0000 UTC m=+253.974016227 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391540 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-proxy-ca-bundles\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391614 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f808bccc-3450-447c-8b0a-7909fe189edd-serving-cert\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391703 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391762 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82zm2\" (UniqueName: \"kubernetes.io/projected/f808bccc-3450-447c-8b0a-7909fe189edd-kube-api-access-82zm2\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391864 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6c99\" (UniqueName: \"kubernetes.io/projected/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-kube-api-access-p6c99\") on node \"crc\" DevicePath \"\""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391889 4580 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391899 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391910 4580 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391921 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391933 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqdxh\" (UniqueName: \"kubernetes.io/projected/794862b0-a985-459a-98d4-cc82612d3593-kube-api-access-kqdxh\") on node \"crc\" DevicePath \"\""
Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391943 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\"
(UniqueName: \"kubernetes.io/secret/794862b0-a985-459a-98d4-cc82612d3593-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391954 4580 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.391964 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794862b0-a985-459a-98d4-cc82612d3593-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:55:48 crc kubenswrapper[4580]: E0321 04:55:48.392582 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.892570038 +0000 UTC m=+253.975153666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.393495 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-client-ca\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.394041 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-config\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.394281 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-proxy-ca-bundles\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.399307 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f808bccc-3450-447c-8b0a-7909fe189edd-serving-cert\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.427468 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82zm2\" (UniqueName: \"kubernetes.io/projected/f808bccc-3450-447c-8b0a-7909fe189edd-kube-api-access-82zm2\") pod \"controller-manager-7d77679dc6-9xg8t\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.494761 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:48 crc kubenswrapper[4580]: E0321 04:55:48.497406 4580 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:48.997319051 +0000 UTC m=+254.079902679 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.521956 4580 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-kmfk5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.522083 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" podUID="329e99ed-cd13-46e2-af1f-3e2bc9eb692d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.589394 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl" event={"ID":"bc866c38-7a32-4dab-a741-204309afddc5","Type":"ContainerStarted","Data":"88fe376abf3326ed9590db2380a7b1691ba246cc35f5ca22499e0cc79874cf26"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.596829 4580 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gltvw" event={"ID":"377f7953-d6ed-4d67-a92c-07da9f5075d3","Type":"ContainerStarted","Data":"c01d69f27126f0160de64557c0b36e83e729f095e73c830e6279d0a61378fc25"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.597456 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.597609 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:48 crc kubenswrapper[4580]: E0321 04:55:48.597926 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:49.097911611 +0000 UTC m=+254.180495239 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.625809 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jnpfl" event={"ID":"96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6","Type":"ContainerStarted","Data":"b153d9b66b780ce2aa682247f455cafb3a9ddc277d595796e9bb29f5569edf49"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.625870 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jnpfl" event={"ID":"96a4ee3e-2a3f-44b8-8ab5-c4cd0b8681f6","Type":"ContainerStarted","Data":"79bab7a4a1fea6ba3b57422d6a6c363e77846663d62baf057f1a82fbe6a41674"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.656272 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bnhsl" podStartSLOduration=193.656249867 podStartE2EDuration="3m13.656249867s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:48.613388547 +0000 UTC m=+253.695972185" watchObservedRunningTime="2026-03-21 04:55:48.656249867 +0000 UTC m=+253.738833495" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.657189 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" 
event={"ID":"5185e797-2aa2-4012-96e0-0afb8c92a09e","Type":"ContainerStarted","Data":"223d4bb4630456d9e2e8801d701888669ba9026a909a9ac9f6300dc9c6172aa1"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.657894 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.661276 4580 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rsrfm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body= Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.661359 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" podUID="5185e797-2aa2-4012-96e0-0afb8c92a09e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.691206 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-jnpfl" podStartSLOduration=193.691175508 podStartE2EDuration="3m13.691175508s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:48.65955846 +0000 UTC m=+253.742142098" watchObservedRunningTime="2026-03-21 04:55:48.691175508 +0000 UTC m=+253.773759136" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.692699 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" podStartSLOduration=192.692690726 podStartE2EDuration="3m12.692690726s" 
podCreationTimestamp="2026-03-21 04:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:48.688909962 +0000 UTC m=+253.771493610" watchObservedRunningTime="2026-03-21 04:55:48.692690726 +0000 UTC m=+253.775274354" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.703648 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9" event={"ID":"6aa8e3c1-9f91-43c2-9eac-6c146199916e","Type":"ContainerStarted","Data":"492ed0d6b6df38dd60cccc0d021b27d9e9065dfc6262a88b609f9a7d0dca99e7"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.703856 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9" event={"ID":"6aa8e3c1-9f91-43c2-9eac-6c146199916e","Type":"ContainerStarted","Data":"86d36ed66c2b5b90c84ce2f65870f4f4eee9e6b8efafa3581410a2dea55353ed"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.705945 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:48 crc kubenswrapper[4580]: E0321 04:55:48.707675 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:49.207644889 +0000 UTC m=+254.290228517 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.716872 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" event={"ID":"8d1b089c-8016-458b-83b5-84f602ea0ba7","Type":"ContainerStarted","Data":"f330ce73140deb65d3b48634f1b3afe76b09042aa1b936265d66878f92f36b12"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.718156 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.722013 4580 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hnl8s container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.722089 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" podUID="8d1b089c-8016-458b-83b5-84f602ea0ba7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.742817 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q8th9" podStartSLOduration=193.742723045 
podStartE2EDuration="3m13.742723045s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:48.741212747 +0000 UTC m=+253.823796405" watchObservedRunningTime="2026-03-21 04:55:48.742723045 +0000 UTC m=+253.825306673" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.749678 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" event={"ID":"8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc","Type":"ContainerDied","Data":"571bec49042e797343b28addacb4e9dc7ba0af0a2c60b367dabf355bf9aa4fc3"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.749750 4580 scope.go:117] "RemoveContainer" containerID="49544dea4693e0ee539af213e7cbfcc9b1333d129a0e81353ce02c7b2902b6c9" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.749963 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.782981 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v" event={"ID":"1e86baa4-7923-4d5e-bb0f-67085a562d68","Type":"ContainerStarted","Data":"58413fec8acb78c2fa63c9db1e7df2cc8efb276fd3341cb37dcedabe2d9aee66"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.807423 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" podStartSLOduration=192.807405069 podStartE2EDuration="3m12.807405069s" podCreationTimestamp="2026-03-21 04:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:48.805931502 +0000 UTC m=+253.888515150" watchObservedRunningTime="2026-03-21 04:55:48.807405069 +0000 UTC m=+253.889988697" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.810070 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:48 crc kubenswrapper[4580]: E0321 04:55:48.810580 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:49.310563007 +0000 UTC m=+254.393146635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.830701 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9csml" event={"ID":"10f921f7-e891-49fd-825b-37843ebc2f29","Type":"ContainerStarted","Data":"9ff09e54b49415943c54fff96395ed61ba0487292d3467c678704a5f75388b12"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.854514 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fshpg" event={"ID":"1b720ad7-2de4-43c9-bab3-81c68d5dfde7","Type":"ContainerStarted","Data":"121b67fd00ef40b208252195038ebd067c4c995994a70737b429625393956225"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.869513 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" event={"ID":"d9cce850-4a50-4a52-ac9b-147fcbde086a","Type":"ContainerStarted","Data":"41322b57af2c01352fc8fd3cda28875862942ef9142da1c59cad5174b3387258"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.870232 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.872379 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t" event={"ID":"26340f0a-c057-4d3a-aac0-45e31a795929","Type":"ContainerStarted","Data":"79e55674fd26c7139bcfd08e2a16a025be8d75e710ce45717d6acef86f70b856"} Mar 21 
04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.882551 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rsrjh" event={"ID":"bf3a3e39-aab5-499e-a39a-0b74c9d7d666","Type":"ContainerStarted","Data":"f582492023741f9ee53ef05ef8001de292d315c31fb8208046f7712e2fab6c5a"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.884960 4580 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xgkxr container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.885006 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" podUID="d9cce850-4a50-4a52-ac9b-147fcbde086a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.885628 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" event={"ID":"e7a9eebb-7ea8-4e16-9d70-f35fb12df177","Type":"ContainerStarted","Data":"5680e55d807f38bcb3656a82687a1b676f5746e3d5b9a3e0610b7ceedd636b0e"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.894362 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fzl5v" podStartSLOduration=192.894342188 podStartE2EDuration="3m12.894342188s" podCreationTimestamp="2026-03-21 04:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:48.855302864 +0000 UTC m=+253.937886492" watchObservedRunningTime="2026-03-21 04:55:48.894342188 +0000 UTC m=+253.976925816" Mar 21 
04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.896097 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp"] Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.906648 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fkfvp"] Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.911291 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:48 crc kubenswrapper[4580]: E0321 04:55:48.912854 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:49.412832849 +0000 UTC m=+254.495416487 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.916747 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt" event={"ID":"b67fb1a5-405c-4419-baa5-e144d82fb317","Type":"ContainerStarted","Data":"27dbe4b4066b26d99f5e579f271b750117f168f258d2f8b85955b0d34951edf4"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.932578 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-62hm8" event={"ID":"239df9bb-a89d-49c9-b889-8cd32c1db001","Type":"ContainerStarted","Data":"20b0032929808d963a0003c753d4f29b57ce129ee14595f8b30e929a14a803f3"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.960905 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" event={"ID":"794862b0-a985-459a-98d4-cc82612d3593","Type":"ContainerDied","Data":"219ebaba072964d000a1323fd9f0aae6f78872e382b87a95841b92ceefe9e8dc"} Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.961389 4580 scope.go:117] "RemoveContainer" containerID="a842032f4090e45a447d6311f734ecc782f43b08273a2e5a829145d74335ad7d" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.961038 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xh9jk" Mar 21 04:55:48 crc kubenswrapper[4580]: I0321 04:55:48.971363 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-fshpg" podStartSLOduration=193.971340409 podStartE2EDuration="3m13.971340409s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:48.969882863 +0000 UTC m=+254.052466521" watchObservedRunningTime="2026-03-21 04:55:48.971340409 +0000 UTC m=+254.053924037" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.012236 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp" event={"ID":"cd4caa55-55f9-42e7-94d2-0069efa2a4ae","Type":"ContainerStarted","Data":"c7dd55a86bd256d034281ac7c075b7c3461fd03be3ec1b76559947ef73b470c8"} Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.015230 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp" event={"ID":"cd4caa55-55f9-42e7-94d2-0069efa2a4ae","Type":"ContainerStarted","Data":"58abf5951f8b70d327da671fc4298d8094967da2ac25d85c79d8ad3893be935c"} Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.022700 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:49 crc kubenswrapper[4580]: E0321 04:55:49.025539 4580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:49.525519511 +0000 UTC m=+254.608103139 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.032872 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jp84t" podStartSLOduration=194.032847744 podStartE2EDuration="3m14.032847744s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:49.032832383 +0000 UTC m=+254.115416011" watchObservedRunningTime="2026-03-21 04:55:49.032847744 +0000 UTC m=+254.115431372" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.058118 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:49 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:49 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:49 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.058180 4580 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.077050 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.078464 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.124944 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:49 crc kubenswrapper[4580]: E0321 04:55:49.126879 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:49.626850639 +0000 UTC m=+254.709434427 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.166354 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" podStartSLOduration=194.166310474 podStartE2EDuration="3m14.166310474s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:49.165203496 +0000 UTC m=+254.247787134" watchObservedRunningTime="2026-03-21 04:55:49.166310474 +0000 UTC m=+254.248894092" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.178774 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46znb" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.201302 4580 ???:1] "http: TLS handshake error from 192.168.126.11:57268: no serving certificate available for the kubelet" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.227393 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:49 crc kubenswrapper[4580]: E0321 04:55:49.227960 4580 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:49.727941761 +0000 UTC m=+254.810525389 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.254633 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t7bxt" podStartSLOduration=194.254608577 podStartE2EDuration="3m14.254608577s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:49.235993252 +0000 UTC m=+254.318576890" watchObservedRunningTime="2026-03-21 04:55:49.254608577 +0000 UTC m=+254.337192215" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.276126 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.321007 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xh9jk"] Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.336992 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:49 crc kubenswrapper[4580]: E0321 04:55:49.337644 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:49.837614028 +0000 UTC m=+254.920197656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.341011 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xh9jk"] Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.388544 4580 ???:1] "http: TLS handshake error from 192.168.126.11:57270: no serving certificate available for the kubelet" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.406893 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp" podStartSLOduration=193.406869196 podStartE2EDuration="3m13.406869196s" podCreationTimestamp="2026-03-21 04:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:49.389068742 +0000 UTC m=+254.471652390" watchObservedRunningTime="2026-03-21 04:55:49.406869196 +0000 UTC m=+254.489452824" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.412503 4580 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.412870 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.442009 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:49 crc kubenswrapper[4580]: E0321 04:55:49.442372 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:49.942355591 +0000 UTC m=+255.024939219 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.495430 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.544989 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:49 crc kubenswrapper[4580]: E0321 04:55:49.545407 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:50.045391222 +0000 UTC m=+255.127974840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.570344 4580 ???:1] "http: TLS handshake error from 192.168.126.11:57286: no serving certificate available for the kubelet" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.641235 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794862b0-a985-459a-98d4-cc82612d3593" path="/var/lib/kubelet/pods/794862b0-a985-459a-98d4-cc82612d3593/volumes" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.642148 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc" path="/var/lib/kubelet/pods/8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc/volumes" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.648675 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:49 crc kubenswrapper[4580]: E0321 04:55:49.649096 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:50.149082629 +0000 UTC m=+255.231666257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.725140 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n99sq"] Mar 21 04:55:49 crc kubenswrapper[4580]: E0321 04:55:49.725357 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc" containerName="route-controller-manager" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.725373 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc" containerName="route-controller-manager" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.725467 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5c10bd-9b6e-41dc-b0a9-30b5f59ad6cc" containerName="route-controller-manager" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.726217 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n99sq" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.741091 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.750856 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.751110 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbthv\" (UniqueName: \"kubernetes.io/projected/82874992-faa8-4c73-955b-ffe5f02726a7-kube-api-access-pbthv\") pod \"certified-operators-n99sq\" (UID: \"82874992-faa8-4c73-955b-ffe5f02726a7\") " pod="openshift-marketplace/certified-operators-n99sq" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.751234 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82874992-faa8-4c73-955b-ffe5f02726a7-utilities\") pod \"certified-operators-n99sq\" (UID: \"82874992-faa8-4c73-955b-ffe5f02726a7\") " pod="openshift-marketplace/certified-operators-n99sq" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.751259 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82874992-faa8-4c73-955b-ffe5f02726a7-catalog-content\") pod \"certified-operators-n99sq\" (UID: \"82874992-faa8-4c73-955b-ffe5f02726a7\") " pod="openshift-marketplace/certified-operators-n99sq" Mar 21 04:55:49 crc kubenswrapper[4580]: E0321 04:55:49.751449 4580 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:50.251430463 +0000 UTC m=+255.334014091 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.800111 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n99sq"] Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.854236 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbthv\" (UniqueName: \"kubernetes.io/projected/82874992-faa8-4c73-955b-ffe5f02726a7-kube-api-access-pbthv\") pod \"certified-operators-n99sq\" (UID: \"82874992-faa8-4c73-955b-ffe5f02726a7\") " pod="openshift-marketplace/certified-operators-n99sq" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.854414 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.854461 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/82874992-faa8-4c73-955b-ffe5f02726a7-utilities\") pod \"certified-operators-n99sq\" (UID: \"82874992-faa8-4c73-955b-ffe5f02726a7\") " pod="openshift-marketplace/certified-operators-n99sq" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.854482 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82874992-faa8-4c73-955b-ffe5f02726a7-catalog-content\") pod \"certified-operators-n99sq\" (UID: \"82874992-faa8-4c73-955b-ffe5f02726a7\") " pod="openshift-marketplace/certified-operators-n99sq" Mar 21 04:55:49 crc kubenswrapper[4580]: E0321 04:55:49.855113 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:50.35509143 +0000 UTC m=+255.437675048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.855181 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82874992-faa8-4c73-955b-ffe5f02726a7-catalog-content\") pod \"certified-operators-n99sq\" (UID: \"82874992-faa8-4c73-955b-ffe5f02726a7\") " pod="openshift-marketplace/certified-operators-n99sq" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.855596 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/82874992-faa8-4c73-955b-ffe5f02726a7-utilities\") pod \"certified-operators-n99sq\" (UID: \"82874992-faa8-4c73-955b-ffe5f02726a7\") " pod="openshift-marketplace/certified-operators-n99sq" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.862771 4580 ???:1] "http: TLS handshake error from 192.168.126.11:57300: no serving certificate available for the kubelet" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.910175 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-c8fh9" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.922821 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"] Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.933038 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbthv\" (UniqueName: \"kubernetes.io/projected/82874992-faa8-4c73-955b-ffe5f02726a7-kube-api-access-pbthv\") pod \"certified-operators-n99sq\" (UID: \"82874992-faa8-4c73-955b-ffe5f02726a7\") " pod="openshift-marketplace/certified-operators-n99sq" Mar 21 04:55:49 crc kubenswrapper[4580]: I0321 04:55:49.955665 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:49 crc kubenswrapper[4580]: E0321 04:55:49.957965 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:50.457938186 +0000 UTC m=+255.540521814 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.010220 4580 ???:1] "http: TLS handshake error from 192.168.126.11:57308: no serving certificate available for the kubelet" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.042754 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n99sq" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.043383 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.047708 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:50 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:50 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:50 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.047756 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.059974 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:50 crc kubenswrapper[4580]: E0321 04:55:50.060310 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:50.56029952 +0000 UTC m=+255.642883148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.066527 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9csml" event={"ID":"10f921f7-e891-49fd-825b-37843ebc2f29","Type":"ContainerStarted","Data":"e4d937b5a488a0805c3842ae6c6620405df3040b7e7b1bd5f40f142cb0b7797f"} Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.068357 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-9csml" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.075674 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jq5t4"] Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.076813 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jq5t4" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.108885 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jq5t4"] Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.118394 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" event={"ID":"f808bccc-3450-447c-8b0a-7909fe189edd","Type":"ContainerStarted","Data":"6ecfdea3ae881c688ca9dfc72e0126c16d9b34aff4b1feec0187f183380ba216"} Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.136528 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9csml" podStartSLOduration=13.136509012 podStartE2EDuration="13.136509012s" podCreationTimestamp="2026-03-21 04:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:50.133389814 +0000 UTC m=+255.215973442" watchObservedRunningTime="2026-03-21 04:55:50.136509012 +0000 UTC m=+255.219092640" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.140175 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.144203 4580 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hnl8s container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.144293 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" podUID="8d1b089c-8016-458b-83b5-84f602ea0ba7" containerName="marketplace-operator" 
probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.160600 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.160733 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.161194 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.161901 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd3cd12-741f-4993-8b39-994545e15c2c-utilities\") pod \"certified-operators-jq5t4\" (UID: \"1dd3cd12-741f-4993-8b39-994545e15c2c\") " pod="openshift-marketplace/certified-operators-jq5t4" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.161985 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd3cd12-741f-4993-8b39-994545e15c2c-catalog-content\") pod \"certified-operators-jq5t4\" (UID: \"1dd3cd12-741f-4993-8b39-994545e15c2c\") " pod="openshift-marketplace/certified-operators-jq5t4" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.162229 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlp7m\" (UniqueName: \"kubernetes.io/projected/1dd3cd12-741f-4993-8b39-994545e15c2c-kube-api-access-tlp7m\") 
pod \"certified-operators-jq5t4\" (UID: \"1dd3cd12-741f-4993-8b39-994545e15c2c\") " pod="openshift-marketplace/certified-operators-jq5t4" Mar 21 04:55:50 crc kubenswrapper[4580]: E0321 04:55:50.163457 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:50.663417883 +0000 UTC m=+255.746001511 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.164207 4580 patch_prober.go:28] interesting pod/console-f9d7485db-48dqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.164275 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-48dqz" podUID="bcade120-6711-4045-9149-08985699febd" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.164612 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.168260 4580 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgk58" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.230260 4580 ???:1] "http: TLS handshake error from 192.168.126.11:57324: no serving certificate available for the kubelet" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.267095 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd3cd12-741f-4993-8b39-994545e15c2c-catalog-content\") pod \"certified-operators-jq5t4\" (UID: \"1dd3cd12-741f-4993-8b39-994545e15c2c\") " pod="openshift-marketplace/certified-operators-jq5t4" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.267173 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.267327 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlp7m\" (UniqueName: \"kubernetes.io/projected/1dd3cd12-741f-4993-8b39-994545e15c2c-kube-api-access-tlp7m\") pod \"certified-operators-jq5t4\" (UID: \"1dd3cd12-741f-4993-8b39-994545e15c2c\") " pod="openshift-marketplace/certified-operators-jq5t4" Mar 21 04:55:50 crc kubenswrapper[4580]: E0321 04:55:50.268369 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:50.768350901 +0000 UTC m=+255.850934529 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.282279 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd3cd12-741f-4993-8b39-994545e15c2c-utilities\") pod \"certified-operators-jq5t4\" (UID: \"1dd3cd12-741f-4993-8b39-994545e15c2c\") " pod="openshift-marketplace/certified-operators-jq5t4" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.282547 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.282588 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.286277 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-77nmx"] Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.286955 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: 
connect: connection refused" start-of-body= Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.287046 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.291242 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd3cd12-741f-4993-8b39-994545e15c2c-catalog-content\") pod \"certified-operators-jq5t4\" (UID: \"1dd3cd12-741f-4993-8b39-994545e15c2c\") " pod="openshift-marketplace/certified-operators-jq5t4" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.303167 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-77nmx" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.306608 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd3cd12-741f-4993-8b39-994545e15c2c-utilities\") pod \"certified-operators-jq5t4\" (UID: \"1dd3cd12-741f-4993-8b39-994545e15c2c\") " pod="openshift-marketplace/certified-operators-jq5t4" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.314133 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.360721 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-77nmx"] Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.393904 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlp7m\" (UniqueName: \"kubernetes.io/projected/1dd3cd12-741f-4993-8b39-994545e15c2c-kube-api-access-tlp7m\") 
pod \"certified-operators-jq5t4\" (UID: \"1dd3cd12-741f-4993-8b39-994545e15c2c\") " pod="openshift-marketplace/certified-operators-jq5t4" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.405714 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.406101 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2dsq\" (UniqueName: \"kubernetes.io/projected/37b3e873-7ca5-4413-9998-6aaf824d6cd7-kube-api-access-l2dsq\") pod \"community-operators-77nmx\" (UID: \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\") " pod="openshift-marketplace/community-operators-77nmx" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.406140 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b3e873-7ca5-4413-9998-6aaf824d6cd7-catalog-content\") pod \"community-operators-77nmx\" (UID: \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\") " pod="openshift-marketplace/community-operators-77nmx" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.406215 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b3e873-7ca5-4413-9998-6aaf824d6cd7-utilities\") pod \"community-operators-77nmx\" (UID: \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\") " pod="openshift-marketplace/community-operators-77nmx" Mar 21 04:55:50 crc kubenswrapper[4580]: E0321 04:55:50.406333 4580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:50.906317964 +0000 UTC m=+255.988901592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.445119 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jq5t4" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.509605 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.509654 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b3e873-7ca5-4413-9998-6aaf824d6cd7-utilities\") pod \"community-operators-77nmx\" (UID: \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\") " pod="openshift-marketplace/community-operators-77nmx" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.509737 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2dsq\" (UniqueName: 
\"kubernetes.io/projected/37b3e873-7ca5-4413-9998-6aaf824d6cd7-kube-api-access-l2dsq\") pod \"community-operators-77nmx\" (UID: \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\") " pod="openshift-marketplace/community-operators-77nmx" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.509756 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b3e873-7ca5-4413-9998-6aaf824d6cd7-catalog-content\") pod \"community-operators-77nmx\" (UID: \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\") " pod="openshift-marketplace/community-operators-77nmx" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.510192 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b3e873-7ca5-4413-9998-6aaf824d6cd7-catalog-content\") pod \"community-operators-77nmx\" (UID: \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\") " pod="openshift-marketplace/community-operators-77nmx" Mar 21 04:55:50 crc kubenswrapper[4580]: E0321 04:55:50.510465 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:51.010452852 +0000 UTC m=+256.093036480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.510821 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b3e873-7ca5-4413-9998-6aaf824d6cd7-utilities\") pod \"community-operators-77nmx\" (UID: \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\") " pod="openshift-marketplace/community-operators-77nmx" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.571264 4580 ???:1] "http: TLS handshake error from 192.168.126.11:57326: no serving certificate available for the kubelet" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.574846 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6ppjj"] Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.598357 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6ppjj" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.610894 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:50 crc kubenswrapper[4580]: E0321 04:55:50.611306 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:51.111289568 +0000 UTC m=+256.193873196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.649837 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2dsq\" (UniqueName: \"kubernetes.io/projected/37b3e873-7ca5-4413-9998-6aaf824d6cd7-kube-api-access-l2dsq\") pod \"community-operators-77nmx\" (UID: \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\") " pod="openshift-marketplace/community-operators-77nmx" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.658262 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-77nmx" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.664021 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6ppjj"] Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.712895 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.713000 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c002830b-7ac1-4912-9b31-bad37ac63104-catalog-content\") pod \"community-operators-6ppjj\" (UID: \"c002830b-7ac1-4912-9b31-bad37ac63104\") " pod="openshift-marketplace/community-operators-6ppjj" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.713031 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6tbg\" (UniqueName: \"kubernetes.io/projected/c002830b-7ac1-4912-9b31-bad37ac63104-kube-api-access-b6tbg\") pod \"community-operators-6ppjj\" (UID: \"c002830b-7ac1-4912-9b31-bad37ac63104\") " pod="openshift-marketplace/community-operators-6ppjj" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.713086 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c002830b-7ac1-4912-9b31-bad37ac63104-utilities\") pod \"community-operators-6ppjj\" (UID: \"c002830b-7ac1-4912-9b31-bad37ac63104\") " pod="openshift-marketplace/community-operators-6ppjj" Mar 21 04:55:50 crc kubenswrapper[4580]: E0321 
04:55:50.713392 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:51.213380355 +0000 UTC m=+256.295963983 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.815023 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:50 crc kubenswrapper[4580]: E0321 04:55:50.815261 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:51.315226666 +0000 UTC m=+256.397810304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.815917 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c002830b-7ac1-4912-9b31-bad37ac63104-catalog-content\") pod \"community-operators-6ppjj\" (UID: \"c002830b-7ac1-4912-9b31-bad37ac63104\") " pod="openshift-marketplace/community-operators-6ppjj" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.815956 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6tbg\" (UniqueName: \"kubernetes.io/projected/c002830b-7ac1-4912-9b31-bad37ac63104-kube-api-access-b6tbg\") pod \"community-operators-6ppjj\" (UID: \"c002830b-7ac1-4912-9b31-bad37ac63104\") " pod="openshift-marketplace/community-operators-6ppjj" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.816036 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c002830b-7ac1-4912-9b31-bad37ac63104-utilities\") pod \"community-operators-6ppjj\" (UID: \"c002830b-7ac1-4912-9b31-bad37ac63104\") " pod="openshift-marketplace/community-operators-6ppjj" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.816068 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: 
\"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:50 crc kubenswrapper[4580]: E0321 04:55:50.816428 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:51.316412036 +0000 UTC m=+256.398995674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.816716 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c002830b-7ac1-4912-9b31-bad37ac63104-utilities\") pod \"community-operators-6ppjj\" (UID: \"c002830b-7ac1-4912-9b31-bad37ac63104\") " pod="openshift-marketplace/community-operators-6ppjj" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.816863 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c002830b-7ac1-4912-9b31-bad37ac63104-catalog-content\") pod \"community-operators-6ppjj\" (UID: \"c002830b-7ac1-4912-9b31-bad37ac63104\") " pod="openshift-marketplace/community-operators-6ppjj" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.846083 4580 ???:1] "http: TLS handshake error from 192.168.126.11:57342: no serving certificate available for the kubelet" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.870198 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-b6tbg\" (UniqueName: \"kubernetes.io/projected/c002830b-7ac1-4912-9b31-bad37ac63104-kube-api-access-b6tbg\") pod \"community-operators-6ppjj\" (UID: \"c002830b-7ac1-4912-9b31-bad37ac63104\") " pod="openshift-marketplace/community-operators-6ppjj" Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.917444 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:50 crc kubenswrapper[4580]: E0321 04:55:50.917857 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:51.417839257 +0000 UTC m=+256.500422875 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:50 crc kubenswrapper[4580]: I0321 04:55:50.928353 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6ppjj" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.020390 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:51 crc kubenswrapper[4580]: E0321 04:55:51.020836 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:51.520821436 +0000 UTC m=+256.603405064 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.025054 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c"] Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.025977 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.051010 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:51 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:51 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:51 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.051495 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.054921 4580 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hnl8s container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.054988 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" podUID="8d1b089c-8016-458b-83b5-84f602ea0ba7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.055079 4580 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hnl8s container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: 
connect: connection refused" start-of-body= Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.055096 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" podUID="8d1b089c-8016-458b-83b5-84f602ea0ba7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.109111 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.109301 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.109413 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.109560 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.109613 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.109822 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.123857 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:51 crc 
kubenswrapper[4580]: I0321 04:55:51.124048 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk8xl\" (UniqueName: \"kubernetes.io/projected/8ba19f04-7cc2-44da-bb2d-cd4200fce325-kube-api-access-mk8xl\") pod \"route-controller-manager-d6d44c798-fvr5c\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.124088 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ba19f04-7cc2-44da-bb2d-cd4200fce325-config\") pod \"route-controller-manager-d6d44c798-fvr5c\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.124161 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ba19f04-7cc2-44da-bb2d-cd4200fce325-serving-cert\") pod \"route-controller-manager-d6d44c798-fvr5c\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: E0321 04:55:51.124281 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:51.624247847 +0000 UTC m=+256.706831475 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.124402 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ba19f04-7cc2-44da-bb2d-cd4200fce325-client-ca\") pod \"route-controller-manager-d6d44c798-fvr5c\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.124966 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c"] Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.130258 4580 ???:1] "http: TLS handshake error from 192.168.126.11:57346: no serving certificate available for the kubelet" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.148184 4580 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rsrfm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.148262 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" podUID="5185e797-2aa2-4012-96e0-0afb8c92a09e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.174205 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" event={"ID":"f808bccc-3450-447c-8b0a-7909fe189edd","Type":"ContainerStarted","Data":"6720f1dce227ed179bd467b0ce0472d88884ddfb7999e75d935fd53bca12757c"} Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.174321 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.178079 4580 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hnl8s container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.178142 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" podUID="8d1b089c-8016-458b-83b5-84f602ea0ba7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.200554 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.226835 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ba19f04-7cc2-44da-bb2d-cd4200fce325-client-ca\") pod \"route-controller-manager-d6d44c798-fvr5c\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " 
pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.226884 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk8xl\" (UniqueName: \"kubernetes.io/projected/8ba19f04-7cc2-44da-bb2d-cd4200fce325-kube-api-access-mk8xl\") pod \"route-controller-manager-d6d44c798-fvr5c\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.226917 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ba19f04-7cc2-44da-bb2d-cd4200fce325-config\") pod \"route-controller-manager-d6d44c798-fvr5c\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.226955 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.226999 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ba19f04-7cc2-44da-bb2d-cd4200fce325-serving-cert\") pod \"route-controller-manager-d6d44c798-fvr5c\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.229044 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/8ba19f04-7cc2-44da-bb2d-cd4200fce325-client-ca\") pod \"route-controller-manager-d6d44c798-fvr5c\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.230044 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ba19f04-7cc2-44da-bb2d-cd4200fce325-config\") pod \"route-controller-manager-d6d44c798-fvr5c\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: E0321 04:55:51.230285 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:51.730273861 +0000 UTC m=+256.812857479 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.260751 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ba19f04-7cc2-44da-bb2d-cd4200fce325-serving-cert\") pod \"route-controller-manager-d6d44c798-fvr5c\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.269752 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk8xl\" (UniqueName: \"kubernetes.io/projected/8ba19f04-7cc2-44da-bb2d-cd4200fce325-kube-api-access-mk8xl\") pod \"route-controller-manager-d6d44c798-fvr5c\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.295962 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" podStartSLOduration=6.29593822 podStartE2EDuration="6.29593822s" podCreationTimestamp="2026-03-21 04:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:51.232108857 +0000 UTC m=+256.314692505" watchObservedRunningTime="2026-03-21 04:55:51.29593822 +0000 UTC m=+256.378521848" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.327906 4580 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:51 crc kubenswrapper[4580]: E0321 04:55:51.330005 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:51.829984009 +0000 UTC m=+256.912567637 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.371028 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.431373 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:51 crc kubenswrapper[4580]: E0321 04:55:51.431866 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:51.931850891 +0000 UTC m=+257.014434519 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.536176 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:51 crc kubenswrapper[4580]: E0321 04:55:51.536614 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:52.036312848 +0000 UTC m=+257.118896476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.536681 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:51 crc kubenswrapper[4580]: E0321 04:55:51.537136 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:52.037127908 +0000 UTC m=+257.119711536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.638551 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:51 crc kubenswrapper[4580]: E0321 04:55:51.638709 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:52.138677182 +0000 UTC m=+257.221260810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.639410 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:51 crc kubenswrapper[4580]: E0321 04:55:51.639894 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:52.139880852 +0000 UTC m=+257.222464540 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.740023 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:51 crc kubenswrapper[4580]: E0321 04:55:51.740685 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:52.240661676 +0000 UTC m=+257.323245304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.811402 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n99sq"] Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.852162 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:51 crc kubenswrapper[4580]: E0321 04:55:51.852644 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:52.35262583 +0000 UTC m=+257.435209458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:51 crc kubenswrapper[4580]: I0321 04:55:51.956134 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:51 crc kubenswrapper[4580]: E0321 04:55:51.956661 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:52.456640895 +0000 UTC m=+257.539224523 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.057744 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:52 crc kubenswrapper[4580]: E0321 04:55:52.058283 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:52.558267111 +0000 UTC m=+257.640850749 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.115431 4580 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rsrfm container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.115521 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" podUID="5185e797-2aa2-4012-96e0-0afb8c92a09e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.128053 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:52 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:52 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:52 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.128134 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" 
podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.152160 4580 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rsrfm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.152589 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" podUID="5185e797-2aa2-4012-96e0-0afb8c92a09e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.162472 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:52 crc kubenswrapper[4580]: E0321 04:55:52.162937 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:52.662920472 +0000 UTC m=+257.745504100 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.229470 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4kpsb"] Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.231021 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.242022 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n99sq" event={"ID":"82874992-faa8-4c73-955b-ffe5f02726a7","Type":"ContainerStarted","Data":"f88e379f490605aba0768efdb6440d644a39f136b6d9d5e331fb19d954796f98"} Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.258384 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" event={"ID":"e7a9eebb-7ea8-4e16-9d70-f35fb12df177","Type":"ContainerStarted","Data":"d91cfaa739b248499aa6a5681df6e83ec5a0bd44d3c7e51c2b6ae1e0a91d87ec"} Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.277797 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:52 crc kubenswrapper[4580]: E0321 04:55:52.278150 4580 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:52.778137357 +0000 UTC m=+257.860720985 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:52 crc kubenswrapper[4580]: W0321 04:55:52.297874 4580 reflector.go:561] object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb": failed to list *v1.Secret: secrets "redhat-marketplace-dockercfg-x2ctb" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Mar 21 04:55:52 crc kubenswrapper[4580]: E0321 04:55:52.297935 4580 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-x2ctb\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"redhat-marketplace-dockercfg-x2ctb\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.321972 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kpsb"] Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.379942 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:52 crc kubenswrapper[4580]: E0321 04:55:52.380093 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:52.88006949 +0000 UTC m=+257.962653118 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.380357 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/484933df-fe17-42ec-99da-d1187d674051-catalog-content\") pod \"redhat-marketplace-4kpsb\" (UID: \"484933df-fe17-42ec-99da-d1187d674051\") " pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.380417 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpp8l\" (UniqueName: \"kubernetes.io/projected/484933df-fe17-42ec-99da-d1187d674051-kube-api-access-gpp8l\") pod \"redhat-marketplace-4kpsb\" (UID: \"484933df-fe17-42ec-99da-d1187d674051\") " pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.380539 
4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.380825 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/484933df-fe17-42ec-99da-d1187d674051-utilities\") pod \"redhat-marketplace-4kpsb\" (UID: \"484933df-fe17-42ec-99da-d1187d674051\") " pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:55:52 crc kubenswrapper[4580]: E0321 04:55:52.385957 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:52.885938737 +0000 UTC m=+257.968522365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.472811 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jq5t4"] Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.484412 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.484919 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/484933df-fe17-42ec-99da-d1187d674051-catalog-content\") pod \"redhat-marketplace-4kpsb\" (UID: \"484933df-fe17-42ec-99da-d1187d674051\") " pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.484971 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpp8l\" (UniqueName: \"kubernetes.io/projected/484933df-fe17-42ec-99da-d1187d674051-kube-api-access-gpp8l\") pod \"redhat-marketplace-4kpsb\" (UID: \"484933df-fe17-42ec-99da-d1187d674051\") " pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.485137 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/484933df-fe17-42ec-99da-d1187d674051-utilities\") pod \"redhat-marketplace-4kpsb\" (UID: \"484933df-fe17-42ec-99da-d1187d674051\") " pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.485726 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/484933df-fe17-42ec-99da-d1187d674051-utilities\") pod \"redhat-marketplace-4kpsb\" (UID: \"484933df-fe17-42ec-99da-d1187d674051\") " pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:55:52 crc kubenswrapper[4580]: E0321 04:55:52.485854 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:52.985832239 +0000 UTC m=+258.068415867 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.486160 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/484933df-fe17-42ec-99da-d1187d674051-catalog-content\") pod \"redhat-marketplace-4kpsb\" (UID: \"484933df-fe17-42ec-99da-d1187d674051\") " pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.586830 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:52 crc kubenswrapper[4580]: E0321 04:55:52.587309 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:53.087293221 +0000 UTC m=+258.169876849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.637056 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-47qgx"] Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.638156 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.698215 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpp8l\" (UniqueName: \"kubernetes.io/projected/484933df-fe17-42ec-99da-d1187d674051-kube-api-access-gpp8l\") pod \"redhat-marketplace-4kpsb\" (UID: \"484933df-fe17-42ec-99da-d1187d674051\") " pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.699597 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:52 crc kubenswrapper[4580]: E0321 04:55:52.700121 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:53.200092825 +0000 UTC m=+258.282676463 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.803939 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dgk5\" (UniqueName: \"kubernetes.io/projected/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-kube-api-access-9dgk5\") pod \"redhat-marketplace-47qgx\" (UID: \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\") " pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.804019 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.804068 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-catalog-content\") pod \"redhat-marketplace-47qgx\" (UID: \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\") " pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.804105 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-utilities\") pod \"redhat-marketplace-47qgx\" (UID: \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\") " pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 04:55:52 crc kubenswrapper[4580]: E0321 04:55:52.804468 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:53.304452109 +0000 UTC m=+258.387035737 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.832895 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-47qgx"] Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.882425 4580 patch_prober.go:28] interesting pod/apiserver-76f77b778f-fshpg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 21 04:55:52 crc kubenswrapper[4580]: [+]log ok Mar 21 04:55:52 crc kubenswrapper[4580]: [+]etcd ok Mar 21 04:55:52 crc kubenswrapper[4580]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 21 04:55:52 crc kubenswrapper[4580]: [+]poststarthook/generic-apiserver-start-informers ok Mar 21 04:55:52 crc kubenswrapper[4580]: [+]poststarthook/max-in-flight-filter ok Mar 21 04:55:52 crc kubenswrapper[4580]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 21 04:55:52 crc 
kubenswrapper[4580]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 21 04:55:52 crc kubenswrapper[4580]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 21 04:55:52 crc kubenswrapper[4580]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 21 04:55:52 crc kubenswrapper[4580]: [+]poststarthook/project.openshift.io-projectcache ok Mar 21 04:55:52 crc kubenswrapper[4580]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 21 04:55:52 crc kubenswrapper[4580]: [+]poststarthook/openshift.io-startinformers ok Mar 21 04:55:52 crc kubenswrapper[4580]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 21 04:55:52 crc kubenswrapper[4580]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 21 04:55:52 crc kubenswrapper[4580]: livez check failed Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.882520 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-fshpg" podUID="1b720ad7-2de4-43c9-bab3-81c68d5dfde7" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.906537 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.906768 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-utilities\") pod \"redhat-marketplace-47qgx\" (UID: \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\") " pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 
04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.906857 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dgk5\" (UniqueName: \"kubernetes.io/projected/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-kube-api-access-9dgk5\") pod \"redhat-marketplace-47qgx\" (UID: \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\") " pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.906943 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-catalog-content\") pod \"redhat-marketplace-47qgx\" (UID: \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\") " pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.907502 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-catalog-content\") pod \"redhat-marketplace-47qgx\" (UID: \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\") " pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 04:55:52 crc kubenswrapper[4580]: E0321 04:55:52.907618 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:53.407594753 +0000 UTC m=+258.490178381 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.907896 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-utilities\") pod \"redhat-marketplace-47qgx\" (UID: \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\") " pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.949205 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-77nmx"] Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.967774 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6ppjj"] Mar 21 04:55:52 crc kubenswrapper[4580]: I0321 04:55:52.982259 4580 ???:1] "http: TLS handshake error from 192.168.126.11:57354: no serving certificate available for the kubelet" Mar 21 04:55:53 crc kubenswrapper[4580]: W0321 04:55:53.006218 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc002830b_7ac1_4912_9b31_bad37ac63104.slice/crio-6a86b52069b079e0be51527b39057dc1b5d36e6ee6dfbdbb3dda3c50b9ab1160 WatchSource:0}: Error finding container 6a86b52069b079e0be51527b39057dc1b5d36e6ee6dfbdbb3dda3c50b9ab1160: Status 404 returned error can't find the container with id 6a86b52069b079e0be51527b39057dc1b5d36e6ee6dfbdbb3dda3c50b9ab1160 Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.007983 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:53 crc kubenswrapper[4580]: E0321 04:55:53.008513 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:53.50849559 +0000 UTC m=+258.591079218 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.038167 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.039241 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.053634 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:53 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:53 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:53 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.053705 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.056239 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dgk5\" (UniqueName: \"kubernetes.io/projected/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-kube-api-access-9dgk5\") pod \"redhat-marketplace-47qgx\" (UID: \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\") " pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.102048 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.102319 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.111900 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:53 crc kubenswrapper[4580]: E0321 04:55:53.112585 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:53.612553727 +0000 UTC m=+258.695137355 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.167760 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.206977 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ns8gg"] Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.214654 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.214749 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0466a8f-6c08-4548-8a9b-2f55defbeec0-kubelet-dir\") 
pod \"revision-pruner-8-crc\" (UID: \"c0466a8f-6c08-4548-8a9b-2f55defbeec0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.214808 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0466a8f-6c08-4548-8a9b-2f55defbeec0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c0466a8f-6c08-4548-8a9b-2f55defbeec0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:55:53 crc kubenswrapper[4580]: E0321 04:55:53.215227 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:53.715207891 +0000 UTC m=+258.797791529 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.216164 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.271200 4580 generic.go:334] "Generic (PLEG): container finished" podID="82874992-faa8-4c73-955b-ffe5f02726a7" containerID="7ddd64ac0170369ce14acc7b7a9dbf7864d4b0a7c9c40255d2987b2e1bdf9a65" exitCode=0 Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.271853 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n99sq" event={"ID":"82874992-faa8-4c73-955b-ffe5f02726a7","Type":"ContainerDied","Data":"7ddd64ac0170369ce14acc7b7a9dbf7864d4b0a7c9c40255d2987b2e1bdf9a65"} Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.276866 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77nmx" event={"ID":"37b3e873-7ca5-4413-9998-6aaf824d6cd7","Type":"ContainerStarted","Data":"024cf1e5452c36fa48642b9243e039270f6e1de9264d722387e731b466b35c75"} Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.289049 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ppjj" event={"ID":"c002830b-7ac1-4912-9b31-bad37ac63104","Type":"ContainerStarted","Data":"6a86b52069b079e0be51527b39057dc1b5d36e6ee6dfbdbb3dda3c50b9ab1160"} Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.322446 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.322990 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0466a8f-6c08-4548-8a9b-2f55defbeec0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: 
\"c0466a8f-6c08-4548-8a9b-2f55defbeec0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.323077 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-catalog-content\") pod \"redhat-operators-ns8gg\" (UID: \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\") " pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.323105 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0466a8f-6c08-4548-8a9b-2f55defbeec0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c0466a8f-6c08-4548-8a9b-2f55defbeec0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.323149 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-utilities\") pod \"redhat-operators-ns8gg\" (UID: \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\") " pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.323259 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmlgk\" (UniqueName: \"kubernetes.io/projected/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-kube-api-access-kmlgk\") pod \"redhat-operators-ns8gg\" (UID: \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\") " pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:55:53 crc kubenswrapper[4580]: E0321 04:55:53.323528 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-21 04:55:53.823504147 +0000 UTC m=+258.906087775 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.323582 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0466a8f-6c08-4548-8a9b-2f55defbeec0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c0466a8f-6c08-4548-8a9b-2f55defbeec0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.329003 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq5t4" event={"ID":"1dd3cd12-741f-4993-8b39-994545e15c2c","Type":"ContainerStarted","Data":"789d58bcf3d2d044bfa0970062663e0129d03a5957f942a94f9f6b8687441a4e"} Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.329340 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq5t4" event={"ID":"1dd3cd12-741f-4993-8b39-994545e15c2c","Type":"ContainerStarted","Data":"65fef62934a61a81e47b3d3e78fbbbd999f553dd7b2e75581edcd2571b9e5b48"} Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.330496 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.379701 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ns8gg"] Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 
04:55:53.396276 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c"] Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.424933 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-utilities\") pod \"redhat-operators-ns8gg\" (UID: \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\") " pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.425038 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.425080 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmlgk\" (UniqueName: \"kubernetes.io/projected/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-kube-api-access-kmlgk\") pod \"redhat-operators-ns8gg\" (UID: \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\") " pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.425197 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-catalog-content\") pod \"redhat-operators-ns8gg\" (UID: \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\") " pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.426426 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-utilities\") pod \"redhat-operators-ns8gg\" (UID: \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\") " pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:55:53 crc kubenswrapper[4580]: E0321 04:55:53.427920 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:53.927899646 +0000 UTC m=+259.010483274 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.429481 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-catalog-content\") pod \"redhat-operators-ns8gg\" (UID: \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\") " pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.458483 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0466a8f-6c08-4548-8a9b-2f55defbeec0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c0466a8f-6c08-4548-8a9b-2f55defbeec0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.527154 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:53 crc kubenswrapper[4580]: E0321 04:55:53.527713 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:54.027659668 +0000 UTC m=+259.110243296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.527866 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:53 crc kubenswrapper[4580]: E0321 04:55:53.528886 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:54.028873735 +0000 UTC m=+259.111457363 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.559747 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmlgk\" (UniqueName: \"kubernetes.io/projected/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-kube-api-access-kmlgk\") pod \"redhat-operators-ns8gg\" (UID: \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\") " pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.604745 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kk5nv"] Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.606838 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.632221 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.632426 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.632462 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs\") pod \"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.632536 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.632560 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.632634 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:53 crc kubenswrapper[4580]: E0321 04:55:53.632793 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:54.132752629 +0000 UTC m=+259.215336257 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.634666 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.642358 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.642939 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.659388 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7-metrics-certs\") pod 
\"network-metrics-daemon-fpb6h\" (UID: \"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7\") " pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.660542 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.686093 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.718258 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kk5nv"] Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.733874 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.733994 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnx45\" (UniqueName: \"kubernetes.io/projected/9940b0fa-e788-4da2-af4f-da4cdc60f12d-kube-api-access-vnx45\") pod \"redhat-operators-kk5nv\" (UID: \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\") " pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.734023 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9940b0fa-e788-4da2-af4f-da4cdc60f12d-utilities\") pod \"redhat-operators-kk5nv\" (UID: \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\") " pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.734084 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9940b0fa-e788-4da2-af4f-da4cdc60f12d-catalog-content\") pod \"redhat-operators-kk5nv\" (UID: \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\") " pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:55:53 crc kubenswrapper[4580]: E0321 04:55:53.734531 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:54.23451201 +0000 UTC m=+259.317095638 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.772283 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.773448 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.782806 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.837845 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.838692 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.839173 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnx45\" (UniqueName: \"kubernetes.io/projected/9940b0fa-e788-4da2-af4f-da4cdc60f12d-kube-api-access-vnx45\") pod \"redhat-operators-kk5nv\" (UID: \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\") " pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.839208 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9940b0fa-e788-4da2-af4f-da4cdc60f12d-utilities\") pod \"redhat-operators-kk5nv\" (UID: \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\") " pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.839279 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9940b0fa-e788-4da2-af4f-da4cdc60f12d-catalog-content\") pod \"redhat-operators-kk5nv\" (UID: \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\") " pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.839906 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9940b0fa-e788-4da2-af4f-da4cdc60f12d-catalog-content\") pod \"redhat-operators-kk5nv\" (UID: \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\") " pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:55:53 crc kubenswrapper[4580]: E0321 04:55:53.840042 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:54.340011553 +0000 UTC m=+259.422595181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.840197 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9940b0fa-e788-4da2-af4f-da4cdc60f12d-utilities\") pod \"redhat-operators-kk5nv\" (UID: \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\") " pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.843289 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.846273 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpb6h" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.857736 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.866162 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.926687 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnx45\" (UniqueName: \"kubernetes.io/projected/9940b0fa-e788-4da2-af4f-da4cdc60f12d-kube-api-access-vnx45\") pod \"redhat-operators-kk5nv\" (UID: \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\") " pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.941503 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:53 crc kubenswrapper[4580]: E0321 04:55:53.941982 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:54.44196579 +0000 UTC m=+259.524549418 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:53 crc kubenswrapper[4580]: I0321 04:55:53.987172 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.046550 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:54 crc kubenswrapper[4580]: E0321 04:55:54.047633 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:54.547605947 +0000 UTC m=+259.630189575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.051845 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:54 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:54 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:54 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.051905 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.151185 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:54 crc kubenswrapper[4580]: E0321 04:55:54.151565 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:55:54.651546943 +0000 UTC m=+259.734130571 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.252668 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:54 crc kubenswrapper[4580]: E0321 04:55:54.253065 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:54.753046437 +0000 UTC m=+259.835630065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.355114 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:54 crc kubenswrapper[4580]: E0321 04:55:54.355651 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:54.855630803 +0000 UTC m=+259.938214431 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.437926 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" event={"ID":"e7a9eebb-7ea8-4e16-9d70-f35fb12df177","Type":"ContainerStarted","Data":"e539b2f8412a7a47e33d6e940c274c35925df7c586f3c0e52b812d650ad63369"} Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.439979 4580 patch_prober.go:28] interesting pod/apiserver-76f77b778f-fshpg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 21 04:55:54 crc kubenswrapper[4580]: [+]log ok Mar 21 04:55:54 crc kubenswrapper[4580]: [+]etcd ok Mar 21 04:55:54 crc kubenswrapper[4580]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 21 04:55:54 crc kubenswrapper[4580]: [+]poststarthook/generic-apiserver-start-informers ok Mar 21 04:55:54 crc kubenswrapper[4580]: [+]poststarthook/max-in-flight-filter ok Mar 21 04:55:54 crc kubenswrapper[4580]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 21 04:55:54 crc kubenswrapper[4580]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 21 04:55:54 crc kubenswrapper[4580]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 21 04:55:54 crc kubenswrapper[4580]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 21 04:55:54 crc kubenswrapper[4580]: [+]poststarthook/project.openshift.io-projectcache ok Mar 21 04:55:54 crc 
kubenswrapper[4580]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 21 04:55:54 crc kubenswrapper[4580]: [+]poststarthook/openshift.io-startinformers ok Mar 21 04:55:54 crc kubenswrapper[4580]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 21 04:55:54 crc kubenswrapper[4580]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 21 04:55:54 crc kubenswrapper[4580]: livez check failed Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.440033 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-fshpg" podUID="1b720ad7-2de4-43c9-bab3-81c68d5dfde7" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.461354 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:54 crc kubenswrapper[4580]: E0321 04:55:54.461696 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:54.961675412 +0000 UTC m=+260.044259040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.493915 4580 generic.go:334] "Generic (PLEG): container finished" podID="1dd3cd12-741f-4993-8b39-994545e15c2c" containerID="789d58bcf3d2d044bfa0970062663e0129d03a5957f942a94f9f6b8687441a4e" exitCode=0 Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.493987 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq5t4" event={"ID":"1dd3cd12-741f-4993-8b39-994545e15c2c","Type":"ContainerDied","Data":"789d58bcf3d2d044bfa0970062663e0129d03a5957f942a94f9f6b8687441a4e"} Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.495549 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" event={"ID":"8ba19f04-7cc2-44da-bb2d-cd4200fce325","Type":"ContainerStarted","Data":"bc84a9c67686b5e71d1d670f234291690a13c0dce766750bd70ff8779ec5713b"} Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.495579 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" event={"ID":"8ba19f04-7cc2-44da-bb2d-cd4200fce325","Type":"ContainerStarted","Data":"235a765b2e7f702b7ec607fa072761fe656a2bca6e1c6830b822475e0e402ab0"} Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.496614 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 
04:55:54.552195 4580 generic.go:334] "Generic (PLEG): container finished" podID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" containerID="bd7ce93da37447c55a0d714fb6aae1b9dddea5562eacf49408b0c54a19a322f2" exitCode=0 Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.552343 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77nmx" event={"ID":"37b3e873-7ca5-4413-9998-6aaf824d6cd7","Type":"ContainerDied","Data":"bd7ce93da37447c55a0d714fb6aae1b9dddea5562eacf49408b0c54a19a322f2"} Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.565186 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:54 crc kubenswrapper[4580]: E0321 04:55:54.565650 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:55.065634819 +0000 UTC m=+260.148218447 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.608217 4580 generic.go:334] "Generic (PLEG): container finished" podID="c002830b-7ac1-4912-9b31-bad37ac63104" containerID="7677ebf5ff4007868f4233ce112433a7606c14c1631642fc51ddc6acdf743d4c" exitCode=0 Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.608268 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ppjj" event={"ID":"c002830b-7ac1-4912-9b31-bad37ac63104","Type":"ContainerDied","Data":"7677ebf5ff4007868f4233ce112433a7606c14c1631642fc51ddc6acdf743d4c"} Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.666188 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:54 crc kubenswrapper[4580]: E0321 04:55:54.668566 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:55.168546274 +0000 UTC m=+260.251129902 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.669721 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" podStartSLOduration=9.669697619 podStartE2EDuration="9.669697619s" podCreationTimestamp="2026-03-21 04:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:54.599379611 +0000 UTC m=+259.681963259" watchObservedRunningTime="2026-03-21 04:55:54.669697619 +0000 UTC m=+259.752281247" Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.768043 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:54 crc kubenswrapper[4580]: E0321 04:55:54.768816 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:55.26880308 +0000 UTC m=+260.351386708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.865430 4580 ???:1] "http: TLS handshake error from 192.168.126.11:57360: no serving certificate available for the kubelet" Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.870959 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:54 crc kubenswrapper[4580]: E0321 04:55:54.871437 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:55.371418097 +0000 UTC m=+260.454001725 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:54 crc kubenswrapper[4580]: I0321 04:55:54.976229 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:54 crc kubenswrapper[4580]: E0321 04:55:54.977084 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:55.477065774 +0000 UTC m=+260.559649402 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.078009 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:55 crc kubenswrapper[4580]: E0321 04:55:55.078458 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:55.578410183 +0000 UTC m=+260.660993811 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.126680 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.127531 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.133488 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.137348 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.172065 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.181112 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:55 crc kubenswrapper[4580]: E0321 04:55:55.184022 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:55.683999069 +0000 UTC m=+260.766582697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.185412 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:55 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:55 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:55 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.185495 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.244478 4580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.283404 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.283912 4580 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2bd0ccba-7716-4511-9291-93441ca57053-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2bd0ccba-7716-4511-9291-93441ca57053\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.284350 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2bd0ccba-7716-4511-9291-93441ca57053-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2bd0ccba-7716-4511-9291-93441ca57053\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:55:55 crc kubenswrapper[4580]: E0321 04:55:55.284687 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:55.784656257 +0000 UTC m=+260.867239895 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.386041 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2bd0ccba-7716-4511-9291-93441ca57053-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2bd0ccba-7716-4511-9291-93441ca57053\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.386128 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2bd0ccba-7716-4511-9291-93441ca57053-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2bd0ccba-7716-4511-9291-93441ca57053\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.386194 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:55 crc kubenswrapper[4580]: E0321 04:55:55.386739 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:55:55.886720898 +0000 UTC m=+260.969304536 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.387069 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2bd0ccba-7716-4511-9291-93441ca57053-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2bd0ccba-7716-4511-9291-93441ca57053\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.456002 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2bd0ccba-7716-4511-9291-93441ca57053-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2bd0ccba-7716-4511-9291-93441ca57053\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.456346 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.488091 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:55 crc kubenswrapper[4580]: E0321 04:55:55.489019 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:55.988998825 +0000 UTC m=+261.071582453 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.501085 4580 patch_prober.go:28] interesting pod/route-controller-manager-d6d44c798-fvr5c container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.501154 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" 
podUID="8ba19f04-7cc2-44da-bb2d-cd4200fce325" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.594586 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:55 crc kubenswrapper[4580]: E0321 04:55:55.595474 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:56.095438375 +0000 UTC m=+261.178022013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.695956 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:55 crc kubenswrapper[4580]: E0321 04:55:55.696331 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:55:56.19630161 +0000 UTC m=+261.278885238 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.705612 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" event={"ID":"e7a9eebb-7ea8-4e16-9d70-f35fb12df177","Type":"ContainerStarted","Data":"4ea93bfdec38488549acf23c1abf72109578db2780c45f6dd43c84dd2a8c608a"} Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.766247 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-rrbz2" podStartSLOduration=17.766221986 podStartE2EDuration="17.766221986s" podCreationTimestamp="2026-03-21 04:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:55.765890246 +0000 UTC m=+260.848473894" watchObservedRunningTime="2026-03-21 04:55:55.766221986 +0000 UTC m=+260.848805634" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.798518 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:55 crc kubenswrapper[4580]: E0321 04:55:55.800201 4580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:56.300183105 +0000 UTC m=+261.382766733 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.807347 4580 ???:1] "http: TLS handshake error from 192.168.126.11:47080: no serving certificate available for the kubelet" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.854185 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.894563 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-47qgx"] Mar 21 04:55:55 crc kubenswrapper[4580]: I0321 04:55:55.900757 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:55 crc kubenswrapper[4580]: E0321 04:55:55.901492 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:55:56.401470752 +0000 UTC m=+261.484054380 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.002601 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:56 crc kubenswrapper[4580]: E0321 04:55:56.003075 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:55:56.503059169 +0000 UTC m=+261.585642797 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bqkqg" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.028335 4580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-21T04:55:55.251241435Z","Handler":null,"Name":""} Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.056132 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:56 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:56 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:56 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.056705 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.096152 4580 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.096190 4580 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: 
kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.103618 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.179757 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kpsb"] Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.242555 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.312414 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.394166 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.426517 4580 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
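[Editor's annotation] The records above capture the resolution of the repeated mount/unmount failures: the plugin watcher picks up `/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock`, the kubelet validates and registers the `kubevirt.io.hostpath-provisioner` driver, and the previously failing `MountVolume.MountDevice` / `UnmountVolume.TearDown` operations then succeed. The shape of that race can be sketched as follows — a minimal illustration, not kubelet code; all names (`mount_device`, `register_plugin`, `REGISTERED_DRIVERS`) are hypothetical:

```python
import itertools

# Illustrative sketch of the failure mode in the log above: volume
# operations against a CSI volume fail with "driver not found" until the
# driver's registration socket is processed, then the same operation succeeds.
REGISTERED_DRIVERS = set()

def mount_device(driver, volume):
    """Mimics attacher.MountDevice: requires a registered CSI driver."""
    if driver not in REGISTERED_DRIVERS:
        raise RuntimeError(
            f"driver name {driver} not found in the list of registered CSI drivers"
        )
    return f"mounted {volume} via {driver}"

def register_plugin(driver):
    """Mimics the plugin watcher picking up the -reg.sock endpoint."""
    REGISTERED_DRIVERS.add(driver)

driver = "kubevirt.io.hostpath-provisioner"
volume = "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"

failures = 0
result = None
# In the real log each retry is spaced by durationBeforeRetry (500ms here,
# managed by nestedpendingoperations); the delays are elided in this sketch.
for attempt in itertools.count():
    try:
        result = mount_device(driver, volume)
        break
    except RuntimeError:
        failures += 1
        if failures == 3:  # driver finishes registering mid-retry
            register_plugin(driver)

print(failures, result)
```

Until registration lands, every attempt is refused with the same "not found in the list of registered CSI drivers" message seen throughout the capture; no amount of retrying helps before that point.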
Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.426561 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.685827 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fpb6h"] Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.723819 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kk5nv"] Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.785075 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bqkqg\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.817395 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ns8gg"] Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.853334 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:56 crc kubenswrapper[4580]: W0321 04:55:56.901847 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbff4e8c9_6b8d_44cf_8c34_2e5b2f5e4f3c.slice/crio-a88f19f1b70285467721c81d613fc450142c5bf967175eadb3818a855548af7d WatchSource:0}: Error finding container a88f19f1b70285467721c81d613fc450142c5bf967175eadb3818a855548af7d: Status 404 returned error can't find the container with id a88f19f1b70285467721c81d613fc450142c5bf967175eadb3818a855548af7d Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.906366 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.948580 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7034ddea71837e40d2e78586dc46ac557ee03e81e78ef00f9a7ac865667617aa"} Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.967196 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c0466a8f-6c08-4548-8a9b-2f55defbeec0","Type":"ContainerStarted","Data":"53a41eef1864aebb2d749893099b510d4ae9b673c0c62b98832d40d7f5bb8170"} Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.975129 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" event={"ID":"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7","Type":"ContainerStarted","Data":"bcef027790613a7e023ff2d73a4eb43389924cbfd2d7fe28400ac4b6af5b17f8"} Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.981133 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e9ac060af807081490986187f92d54476763c1ae48de21c4f6e849673ef0393f"} Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.983062 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kpsb" event={"ID":"484933df-fe17-42ec-99da-d1187d674051","Type":"ContainerStarted","Data":"9fec92962e0748371e579a57393f153a9f931600cc6de8afa90c515e13e2c767"} Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.984158 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3f60a3c915ebe9f03c34086cf6e3438ca1960839e6aca8fc8ea9d0edb8bb9c80"} Mar 21 04:55:56 crc kubenswrapper[4580]: I0321 04:55:56.988659 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47qgx" event={"ID":"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d","Type":"ContainerStarted","Data":"ec8bbfbc91e4559923ec364b21f3d4ff175387916004e2566c82486ddde49cb1"} Mar 21 04:55:57 crc kubenswrapper[4580]: I0321 04:55:57.050927 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:57 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:57 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:57 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:57 crc kubenswrapper[4580]: I0321 04:55:57.050994 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:57 crc 
kubenswrapper[4580]: W0321 04:55:57.162565 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2bd0ccba_7716_4511_9291_93441ca57053.slice/crio-68efcf739c262f5e2dac40feb08bc4ea4bf840525beff57ad5201f29aba0b889 WatchSource:0}: Error finding container 68efcf739c262f5e2dac40feb08bc4ea4bf840525beff57ad5201f29aba0b889: Status 404 returned error can't find the container with id 68efcf739c262f5e2dac40feb08bc4ea4bf840525beff57ad5201f29aba0b889 Mar 21 04:55:57 crc kubenswrapper[4580]: I0321 04:55:57.663480 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.043494 4580 generic.go:334] "Generic (PLEG): container finished" podID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" containerID="de51185f02be9d281bb28d1cbb8576d2fd4cd3d333b5c8fe87f90731f8049eb7" exitCode=0 Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.043875 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47qgx" event={"ID":"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d","Type":"ContainerDied","Data":"de51185f02be9d281bb28d1cbb8576d2fd4cd3d333b5c8fe87f90731f8049eb7"} Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.057122 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:58 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:58 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:58 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.057174 4580 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.069362 4580 generic.go:334] "Generic (PLEG): container finished" podID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" containerID="a30f7c887f16080c0beefee2cd736ea22bf42f8eada48466c4890bb0c6c48275" exitCode=0 Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.069458 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ns8gg" event={"ID":"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c","Type":"ContainerDied","Data":"a30f7c887f16080c0beefee2cd736ea22bf42f8eada48466c4890bb0c6c48275"} Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.069486 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ns8gg" event={"ID":"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c","Type":"ContainerStarted","Data":"a88f19f1b70285467721c81d613fc450142c5bf967175eadb3818a855548af7d"} Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.087185 4580 generic.go:334] "Generic (PLEG): container finished" podID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" containerID="1cd15adb1c59d9d1b28ee18dda8dc78af7c5242ef08704cbdc0b2f50fcb364a7" exitCode=0 Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.087347 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kk5nv" event={"ID":"9940b0fa-e788-4da2-af4f-da4cdc60f12d","Type":"ContainerDied","Data":"1cd15adb1c59d9d1b28ee18dda8dc78af7c5242ef08704cbdc0b2f50fcb364a7"} Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.087404 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kk5nv" event={"ID":"9940b0fa-e788-4da2-af4f-da4cdc60f12d","Type":"ContainerStarted","Data":"0902f30bc0fdb7c2d6d72fd627e107268e9b1ffa1848e962702c70cc3d37f7e9"} Mar 21 
04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.089872 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2bd0ccba-7716-4511-9291-93441ca57053","Type":"ContainerStarted","Data":"68efcf739c262f5e2dac40feb08bc4ea4bf840525beff57ad5201f29aba0b889"} Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.106111 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fe618e278ac84c2f3b6c37dda4a67a8df331f85c853a84ffaef8ac201237554e"} Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.111350 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"27df8954c5f737a0801c0d78792d63175481ba45a1810dbdefff64758c69ab02"} Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.111831 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.130412 4580 generic.go:334] "Generic (PLEG): container finished" podID="92b13d13-f88e-47cc-8815-34b54fd68711" containerID="a6be6f616e65ef14a7724f8ea93ae87aaa63f9e0f48e3f37e0b4c8261348e254" exitCode=0 Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.130519 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" event={"ID":"92b13d13-f88e-47cc-8815-34b54fd68711","Type":"ContainerDied","Data":"a6be6f616e65ef14a7724f8ea93ae87aaa63f9e0f48e3f37e0b4c8261348e254"} Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.133703 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bqkqg"] Mar 21 04:55:58 crc 
kubenswrapper[4580]: I0321 04:55:58.136087 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b8b3bcdc006477cb385278e9162f9fceb7440c098e5b2c6b93b5a561cbc363f9"} Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.158089 4580 generic.go:334] "Generic (PLEG): container finished" podID="484933df-fe17-42ec-99da-d1187d674051" containerID="5ec157e076f2e1e8f1f17c3ddf48f934757a7ab4c6782b6ad836e57f9fbc65a1" exitCode=0 Mar 21 04:55:58 crc kubenswrapper[4580]: I0321 04:55:58.158163 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kpsb" event={"ID":"484933df-fe17-42ec-99da-d1187d674051","Type":"ContainerDied","Data":"5ec157e076f2e1e8f1f17c3ddf48f934757a7ab4c6782b6ad836e57f9fbc65a1"} Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.058104 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:55:59 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:55:59 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:55:59 crc kubenswrapper[4580]: healthz check failed Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.058955 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.183051 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9csml" Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.190911 4580 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" event={"ID":"621054bf-a821-4811-b7b6-5b7d011b8a05","Type":"ContainerStarted","Data":"d04037254d9d5d58ed804eb3977d23d467cd29678ca601d731a677709c6edf5c"} Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.191059 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" event={"ID":"621054bf-a821-4811-b7b6-5b7d011b8a05","Type":"ContainerStarted","Data":"579090e3e5cc87a5994b081dd62b03085b7e0f0ca53e3e15e97ec0ca2ce43402"} Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.192317 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.240513 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" event={"ID":"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7","Type":"ContainerStarted","Data":"534cdc67ba5b45f06e74e9a1fd3967e1dc3de56e28d38ce72fe8bdc7b6a07556"} Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.251102 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" podStartSLOduration=204.251043239 podStartE2EDuration="3m24.251043239s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:59.242630189 +0000 UTC m=+264.325213847" watchObservedRunningTime="2026-03-21 04:55:59.251043239 +0000 UTC m=+264.333626897" Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.297868 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"2bd0ccba-7716-4511-9291-93441ca57053","Type":"ContainerStarted","Data":"6d0ed14b54aa88d7ae885738a19e90d639d9299eb5d15111f52398eca516819e"} Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.314840 4580 generic.go:334] "Generic (PLEG): container finished" podID="c0466a8f-6c08-4548-8a9b-2f55defbeec0" containerID="2e7bcd4d44dc1ee0bffd9cd106f793096d4c881fc2d7339af9de9627518815fa" exitCode=0 Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.315019 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c0466a8f-6c08-4548-8a9b-2f55defbeec0","Type":"ContainerDied","Data":"2e7bcd4d44dc1ee0bffd9cd106f793096d4c881fc2d7339af9de9627518815fa"} Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.366736 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=4.366702733 podStartE2EDuration="4.366702733s" podCreationTimestamp="2026-03-21 04:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:55:59.333901879 +0000 UTC m=+264.416485537" watchObservedRunningTime="2026-03-21 04:55:59.366702733 +0000 UTC m=+264.449286361" Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.436603 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.458863 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-fshpg" Mar 21 04:55:59 crc kubenswrapper[4580]: I0321 04:55:59.915372 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.028478 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92b13d13-f88e-47cc-8815-34b54fd68711-config-volume\") pod \"92b13d13-f88e-47cc-8815-34b54fd68711\" (UID: \"92b13d13-f88e-47cc-8815-34b54fd68711\") " Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.028593 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nlqt\" (UniqueName: \"kubernetes.io/projected/92b13d13-f88e-47cc-8815-34b54fd68711-kube-api-access-8nlqt\") pod \"92b13d13-f88e-47cc-8815-34b54fd68711\" (UID: \"92b13d13-f88e-47cc-8815-34b54fd68711\") " Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.028765 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92b13d13-f88e-47cc-8815-34b54fd68711-secret-volume\") pod \"92b13d13-f88e-47cc-8815-34b54fd68711\" (UID: \"92b13d13-f88e-47cc-8815-34b54fd68711\") " Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.037969 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92b13d13-f88e-47cc-8815-34b54fd68711-config-volume" (OuterVolumeSpecName: "config-volume") pod "92b13d13-f88e-47cc-8815-34b54fd68711" (UID: "92b13d13-f88e-47cc-8815-34b54fd68711"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.060675 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92b13d13-f88e-47cc-8815-34b54fd68711-kube-api-access-8nlqt" (OuterVolumeSpecName: "kube-api-access-8nlqt") pod "92b13d13-f88e-47cc-8815-34b54fd68711" (UID: "92b13d13-f88e-47cc-8815-34b54fd68711"). 
InnerVolumeSpecName "kube-api-access-8nlqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.064715 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:00 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:00 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:00 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.064839 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.077709 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92b13d13-f88e-47cc-8815-34b54fd68711-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "92b13d13-f88e-47cc-8815-34b54fd68711" (UID: "92b13d13-f88e-47cc-8815-34b54fd68711"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.130564 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nlqt\" (UniqueName: \"kubernetes.io/projected/92b13d13-f88e-47cc-8815-34b54fd68711-kube-api-access-8nlqt\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.130625 4580 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92b13d13-f88e-47cc-8815-34b54fd68711-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.130641 4580 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92b13d13-f88e-47cc-8815-34b54fd68711-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.160089 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567816-m2qj9"] Mar 21 04:56:00 crc kubenswrapper[4580]: E0321 04:56:00.160444 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b13d13-f88e-47cc-8815-34b54fd68711" containerName="collect-profiles" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.160469 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b13d13-f88e-47cc-8815-34b54fd68711" containerName="collect-profiles" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.160639 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="92b13d13-f88e-47cc-8815-34b54fd68711" containerName="collect-profiles" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.161201 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-m2qj9" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.161824 4580 patch_prober.go:28] interesting pod/console-f9d7485db-48dqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.161870 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-48dqz" podUID="bcade120-6711-4045-9149-08985699febd" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.165329 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.235907 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpbx7\" (UniqueName: \"kubernetes.io/projected/0cb67fb0-cbe3-47cd-9029-f54e6e74729d-kube-api-access-jpbx7\") pod \"auto-csr-approver-29567816-m2qj9\" (UID: \"0cb67fb0-cbe3-47cd-9029-f54e6e74729d\") " pod="openshift-infra/auto-csr-approver-29567816-m2qj9" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.253757 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-m2qj9"] Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.271150 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.271220 4580 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.271234 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.271344 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.337373 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpbx7\" (UniqueName: \"kubernetes.io/projected/0cb67fb0-cbe3-47cd-9029-f54e6e74729d-kube-api-access-jpbx7\") pod \"auto-csr-approver-29567816-m2qj9\" (UID: \"0cb67fb0-cbe3-47cd-9029-f54e6e74729d\") " pod="openshift-infra/auto-csr-approver-29567816-m2qj9" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.339329 4580 generic.go:334] "Generic (PLEG): container finished" podID="2bd0ccba-7716-4511-9291-93441ca57053" containerID="6d0ed14b54aa88d7ae885738a19e90d639d9299eb5d15111f52398eca516819e" exitCode=0 Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.339422 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2bd0ccba-7716-4511-9291-93441ca57053","Type":"ContainerDied","Data":"6d0ed14b54aa88d7ae885738a19e90d639d9299eb5d15111f52398eca516819e"} Mar 21 04:56:00 crc 
kubenswrapper[4580]: I0321 04:56:00.367899 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpbx7\" (UniqueName: \"kubernetes.io/projected/0cb67fb0-cbe3-47cd-9029-f54e6e74729d-kube-api-access-jpbx7\") pod \"auto-csr-approver-29567816-m2qj9\" (UID: \"0cb67fb0-cbe3-47cd-9029-f54e6e74729d\") " pod="openshift-infra/auto-csr-approver-29567816-m2qj9" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.382514 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" event={"ID":"92b13d13-f88e-47cc-8815-34b54fd68711","Type":"ContainerDied","Data":"e7df0b6c67b0beee27bb469cede347f1cf133acaba77c61b28df3537000213a9"} Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.382592 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7df0b6c67b0beee27bb469cede347f1cf133acaba77c61b28df3537000213a9" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.382597 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.385047 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fpb6h" event={"ID":"ea2ad066-6f3b-4e1d-84e4-7eb9af8ca7f7","Type":"ContainerStarted","Data":"55a57d7b34bfb6fb2d47cf97cdf1e53fa00b107b1f94049a35ebe55dfcec7e29"} Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.425316 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fpb6h" podStartSLOduration=205.425293765 podStartE2EDuration="3m25.425293765s" podCreationTimestamp="2026-03-21 04:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:56:00.406862498 +0000 UTC m=+265.489446146" watchObservedRunningTime="2026-03-21 04:56:00.425293765 +0000 UTC m=+265.507877393" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.511566 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-m2qj9" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.851538 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.951936 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0466a8f-6c08-4548-8a9b-2f55defbeec0-kube-api-access\") pod \"c0466a8f-6c08-4548-8a9b-2f55defbeec0\" (UID: \"c0466a8f-6c08-4548-8a9b-2f55defbeec0\") " Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.952040 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0466a8f-6c08-4548-8a9b-2f55defbeec0-kubelet-dir\") pod \"c0466a8f-6c08-4548-8a9b-2f55defbeec0\" (UID: \"c0466a8f-6c08-4548-8a9b-2f55defbeec0\") " Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.952993 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0466a8f-6c08-4548-8a9b-2f55defbeec0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c0466a8f-6c08-4548-8a9b-2f55defbeec0" (UID: "c0466a8f-6c08-4548-8a9b-2f55defbeec0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.953824 4580 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0466a8f-6c08-4548-8a9b-2f55defbeec0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:00 crc kubenswrapper[4580]: I0321 04:56:00.984968 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0466a8f-6c08-4548-8a9b-2f55defbeec0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c0466a8f-6c08-4548-8a9b-2f55defbeec0" (UID: "c0466a8f-6c08-4548-8a9b-2f55defbeec0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:01 crc kubenswrapper[4580]: I0321 04:56:01.052657 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:01 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:01 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:01 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:01 crc kubenswrapper[4580]: I0321 04:56:01.052712 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:01 crc kubenswrapper[4580]: I0321 04:56:01.056345 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0466a8f-6c08-4548-8a9b-2f55defbeec0-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:01 crc kubenswrapper[4580]: I0321 04:56:01.061357 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" Mar 21 04:56:01 crc kubenswrapper[4580]: I0321 04:56:01.125827 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rsrfm" Mar 21 04:56:01 crc kubenswrapper[4580]: I0321 04:56:01.207749 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-m2qj9"] Mar 21 04:56:01 crc kubenswrapper[4580]: I0321 04:56:01.450355 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"c0466a8f-6c08-4548-8a9b-2f55defbeec0","Type":"ContainerDied","Data":"53a41eef1864aebb2d749893099b510d4ae9b673c0c62b98832d40d7f5bb8170"} Mar 21 04:56:01 crc kubenswrapper[4580]: I0321 04:56:01.450398 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:56:01 crc kubenswrapper[4580]: I0321 04:56:01.450428 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53a41eef1864aebb2d749893099b510d4ae9b673c0c62b98832d40d7f5bb8170" Mar 21 04:56:01 crc kubenswrapper[4580]: I0321 04:56:01.480479 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567816-m2qj9" event={"ID":"0cb67fb0-cbe3-47cd-9029-f54e6e74729d","Type":"ContainerStarted","Data":"715555ab4a05645e44b374807e00f38c00af61d4dbc00b23ee7eda184d2f474c"} Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.052507 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:02 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:02 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:02 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.052903 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.247768 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.315254 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2bd0ccba-7716-4511-9291-93441ca57053-kubelet-dir\") pod \"2bd0ccba-7716-4511-9291-93441ca57053\" (UID: \"2bd0ccba-7716-4511-9291-93441ca57053\") " Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.315442 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2bd0ccba-7716-4511-9291-93441ca57053-kube-api-access\") pod \"2bd0ccba-7716-4511-9291-93441ca57053\" (UID: \"2bd0ccba-7716-4511-9291-93441ca57053\") " Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.323628 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd0ccba-7716-4511-9291-93441ca57053-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2bd0ccba-7716-4511-9291-93441ca57053" (UID: "2bd0ccba-7716-4511-9291-93441ca57053"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.338457 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bd0ccba-7716-4511-9291-93441ca57053-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2bd0ccba-7716-4511-9291-93441ca57053" (UID: "2bd0ccba-7716-4511-9291-93441ca57053"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.426208 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2bd0ccba-7716-4511-9291-93441ca57053-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.426251 4580 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2bd0ccba-7716-4511-9291-93441ca57053-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.586625 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"] Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.586893 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" podUID="f808bccc-3450-447c-8b0a-7909fe189edd" containerName="controller-manager" containerID="cri-o://6720f1dce227ed179bd467b0ce0472d88884ddfb7999e75d935fd53bca12757c" gracePeriod=30 Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.648414 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c"] Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.648665 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" podUID="8ba19f04-7cc2-44da-bb2d-cd4200fce325" containerName="route-controller-manager" containerID="cri-o://bc84a9c67686b5e71d1d670f234291690a13c0dce766750bd70ff8779ec5713b" gracePeriod=30 Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.649188 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.649135 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2bd0ccba-7716-4511-9291-93441ca57053","Type":"ContainerDied","Data":"68efcf739c262f5e2dac40feb08bc4ea4bf840525beff57ad5201f29aba0b889"} Mar 21 04:56:02 crc kubenswrapper[4580]: I0321 04:56:02.649288 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68efcf739c262f5e2dac40feb08bc4ea4bf840525beff57ad5201f29aba0b889" Mar 21 04:56:03 crc kubenswrapper[4580]: I0321 04:56:03.051067 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:03 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:03 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:03 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:03 crc kubenswrapper[4580]: I0321 04:56:03.051598 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:03 crc kubenswrapper[4580]: I0321 04:56:03.716200 4580 generic.go:334] "Generic (PLEG): container finished" podID="f808bccc-3450-447c-8b0a-7909fe189edd" containerID="6720f1dce227ed179bd467b0ce0472d88884ddfb7999e75d935fd53bca12757c" exitCode=0 Mar 21 04:56:03 crc kubenswrapper[4580]: I0321 04:56:03.716298 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" 
event={"ID":"f808bccc-3450-447c-8b0a-7909fe189edd","Type":"ContainerDied","Data":"6720f1dce227ed179bd467b0ce0472d88884ddfb7999e75d935fd53bca12757c"} Mar 21 04:56:03 crc kubenswrapper[4580]: I0321 04:56:03.736539 4580 generic.go:334] "Generic (PLEG): container finished" podID="8ba19f04-7cc2-44da-bb2d-cd4200fce325" containerID="bc84a9c67686b5e71d1d670f234291690a13c0dce766750bd70ff8779ec5713b" exitCode=0 Mar 21 04:56:03 crc kubenswrapper[4580]: I0321 04:56:03.736624 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" event={"ID":"8ba19f04-7cc2-44da-bb2d-cd4200fce325","Type":"ContainerDied","Data":"bc84a9c67686b5e71d1d670f234291690a13c0dce766750bd70ff8779ec5713b"} Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.022680 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.026529 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.065579 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:04 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:04 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:04 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.065638 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.071160 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495544757-v6422"] Mar 21 04:56:04 crc kubenswrapper[4580]: E0321 04:56:04.085565 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd0ccba-7716-4511-9291-93441ca57053" containerName="pruner" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.085608 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd0ccba-7716-4511-9291-93441ca57053" containerName="pruner" Mar 21 04:56:04 crc kubenswrapper[4580]: E0321 04:56:04.085623 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f808bccc-3450-447c-8b0a-7909fe189edd" containerName="controller-manager" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.085631 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f808bccc-3450-447c-8b0a-7909fe189edd" containerName="controller-manager" Mar 21 04:56:04 crc kubenswrapper[4580]: E0321 04:56:04.085640 4580 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c0466a8f-6c08-4548-8a9b-2f55defbeec0" containerName="pruner" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.085647 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0466a8f-6c08-4548-8a9b-2f55defbeec0" containerName="pruner" Mar 21 04:56:04 crc kubenswrapper[4580]: E0321 04:56:04.085663 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba19f04-7cc2-44da-bb2d-cd4200fce325" containerName="route-controller-manager" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.085669 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba19f04-7cc2-44da-bb2d-cd4200fce325" containerName="route-controller-manager" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.085818 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba19f04-7cc2-44da-bb2d-cd4200fce325" containerName="route-controller-manager" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.085826 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f808bccc-3450-447c-8b0a-7909fe189edd" containerName="controller-manager" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.085832 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd0ccba-7716-4511-9291-93441ca57053" containerName="pruner" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.085841 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0466a8f-6c08-4548-8a9b-2f55defbeec0" containerName="pruner" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.086307 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.099657 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495544757-v6422"] Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.159870 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f808bccc-3450-447c-8b0a-7909fe189edd-serving-cert\") pod \"f808bccc-3450-447c-8b0a-7909fe189edd\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.160092 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-proxy-ca-bundles\") pod \"f808bccc-3450-447c-8b0a-7909fe189edd\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.160177 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ba19f04-7cc2-44da-bb2d-cd4200fce325-client-ca\") pod \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.160283 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ba19f04-7cc2-44da-bb2d-cd4200fce325-config\") pod \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.160343 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-client-ca\") pod \"f808bccc-3450-447c-8b0a-7909fe189edd\" (UID: 
\"f808bccc-3450-447c-8b0a-7909fe189edd\") " Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.160407 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ba19f04-7cc2-44da-bb2d-cd4200fce325-serving-cert\") pod \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.160447 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-config\") pod \"f808bccc-3450-447c-8b0a-7909fe189edd\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.160517 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk8xl\" (UniqueName: \"kubernetes.io/projected/8ba19f04-7cc2-44da-bb2d-cd4200fce325-kube-api-access-mk8xl\") pod \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\" (UID: \"8ba19f04-7cc2-44da-bb2d-cd4200fce325\") " Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.160605 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82zm2\" (UniqueName: \"kubernetes.io/projected/f808bccc-3450-447c-8b0a-7909fe189edd-kube-api-access-82zm2\") pod \"f808bccc-3450-447c-8b0a-7909fe189edd\" (UID: \"f808bccc-3450-447c-8b0a-7909fe189edd\") " Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.160821 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-config\") pod \"route-controller-manager-6495544757-v6422\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.160891 4580 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-client-ca\") pod \"route-controller-manager-6495544757-v6422\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.160930 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgf4r\" (UniqueName: \"kubernetes.io/projected/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-kube-api-access-vgf4r\") pod \"route-controller-manager-6495544757-v6422\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.161069 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-serving-cert\") pod \"route-controller-manager-6495544757-v6422\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.161580 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ba19f04-7cc2-44da-bb2d-cd4200fce325-config" (OuterVolumeSpecName: "config") pod "8ba19f04-7cc2-44da-bb2d-cd4200fce325" (UID: "8ba19f04-7cc2-44da-bb2d-cd4200fce325"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.162038 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-config" (OuterVolumeSpecName: "config") pod "f808bccc-3450-447c-8b0a-7909fe189edd" (UID: "f808bccc-3450-447c-8b0a-7909fe189edd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.162116 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f808bccc-3450-447c-8b0a-7909fe189edd" (UID: "f808bccc-3450-447c-8b0a-7909fe189edd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.163259 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ba19f04-7cc2-44da-bb2d-cd4200fce325-client-ca" (OuterVolumeSpecName: "client-ca") pod "8ba19f04-7cc2-44da-bb2d-cd4200fce325" (UID: "8ba19f04-7cc2-44da-bb2d-cd4200fce325"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.165244 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-client-ca" (OuterVolumeSpecName: "client-ca") pod "f808bccc-3450-447c-8b0a-7909fe189edd" (UID: "f808bccc-3450-447c-8b0a-7909fe189edd"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.174469 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f808bccc-3450-447c-8b0a-7909fe189edd-kube-api-access-82zm2" (OuterVolumeSpecName: "kube-api-access-82zm2") pod "f808bccc-3450-447c-8b0a-7909fe189edd" (UID: "f808bccc-3450-447c-8b0a-7909fe189edd"). InnerVolumeSpecName "kube-api-access-82zm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.174667 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f808bccc-3450-447c-8b0a-7909fe189edd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f808bccc-3450-447c-8b0a-7909fe189edd" (UID: "f808bccc-3450-447c-8b0a-7909fe189edd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.188570 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba19f04-7cc2-44da-bb2d-cd4200fce325-kube-api-access-mk8xl" (OuterVolumeSpecName: "kube-api-access-mk8xl") pod "8ba19f04-7cc2-44da-bb2d-cd4200fce325" (UID: "8ba19f04-7cc2-44da-bb2d-cd4200fce325"). InnerVolumeSpecName "kube-api-access-mk8xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.205409 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba19f04-7cc2-44da-bb2d-cd4200fce325-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8ba19f04-7cc2-44da-bb2d-cd4200fce325" (UID: "8ba19f04-7cc2-44da-bb2d-cd4200fce325"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.262236 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-config\") pod \"route-controller-manager-6495544757-v6422\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.262320 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-client-ca\") pod \"route-controller-manager-6495544757-v6422\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.262362 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgf4r\" (UniqueName: \"kubernetes.io/projected/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-kube-api-access-vgf4r\") pod \"route-controller-manager-6495544757-v6422\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.262451 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-serving-cert\") pod \"route-controller-manager-6495544757-v6422\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.262536 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk8xl\" (UniqueName: 
\"kubernetes.io/projected/8ba19f04-7cc2-44da-bb2d-cd4200fce325-kube-api-access-mk8xl\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.262551 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82zm2\" (UniqueName: \"kubernetes.io/projected/f808bccc-3450-447c-8b0a-7909fe189edd-kube-api-access-82zm2\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.262567 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f808bccc-3450-447c-8b0a-7909fe189edd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.262579 4580 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.262591 4580 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ba19f04-7cc2-44da-bb2d-cd4200fce325-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.262603 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ba19f04-7cc2-44da-bb2d-cd4200fce325-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.262615 4580 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.262629 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ba19f04-7cc2-44da-bb2d-cd4200fce325-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:04 crc 
kubenswrapper[4580]: I0321 04:56:04.262642 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f808bccc-3450-447c-8b0a-7909fe189edd-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.266307 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-config\") pod \"route-controller-manager-6495544757-v6422\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.269846 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-client-ca\") pod \"route-controller-manager-6495544757-v6422\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.284502 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-serving-cert\") pod \"route-controller-manager-6495544757-v6422\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.304600 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgf4r\" (UniqueName: \"kubernetes.io/projected/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-kube-api-access-vgf4r\") pod \"route-controller-manager-6495544757-v6422\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 
04:56:04.429662 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.802571 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" event={"ID":"f808bccc-3450-447c-8b0a-7909fe189edd","Type":"ContainerDied","Data":"6ecfdea3ae881c688ca9dfc72e0126c16d9b34aff4b1feec0187f183380ba216"} Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.802752 4580 scope.go:117] "RemoveContainer" containerID="6720f1dce227ed179bd467b0ce0472d88884ddfb7999e75d935fd53bca12757c" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.802894 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d77679dc6-9xg8t" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.836708 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" event={"ID":"8ba19f04-7cc2-44da-bb2d-cd4200fce325","Type":"ContainerDied","Data":"235a765b2e7f702b7ec607fa072761fe656a2bca6e1c6830b822475e0e402ab0"} Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.836882 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c" Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.866184 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"] Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.875936 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7d77679dc6-9xg8t"] Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.910909 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c"] Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.919899 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d6d44c798-fvr5c"] Mar 21 04:56:04 crc kubenswrapper[4580]: I0321 04:56:04.951020 4580 scope.go:117] "RemoveContainer" containerID="bc84a9c67686b5e71d1d670f234291690a13c0dce766750bd70ff8779ec5713b" Mar 21 04:56:05 crc kubenswrapper[4580]: I0321 04:56:05.065831 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495544757-v6422"] Mar 21 04:56:05 crc kubenswrapper[4580]: I0321 04:56:05.067252 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:05 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:05 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:05 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:05 crc kubenswrapper[4580]: I0321 04:56:05.067342 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" 
podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:05 crc kubenswrapper[4580]: W0321 04:56:05.099375 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcf9a8dc_2589_46ca_8063_4d981d5e8ca5.slice/crio-d004d4cc1e8b44eca01cbc47e6e6a0c8da496e3006324fbb5b9d5a3593a2ce19 WatchSource:0}: Error finding container d004d4cc1e8b44eca01cbc47e6e6a0c8da496e3006324fbb5b9d5a3593a2ce19: Status 404 returned error can't find the container with id d004d4cc1e8b44eca01cbc47e6e6a0c8da496e3006324fbb5b9d5a3593a2ce19 Mar 21 04:56:05 crc kubenswrapper[4580]: I0321 04:56:05.668540 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba19f04-7cc2-44da-bb2d-cd4200fce325" path="/var/lib/kubelet/pods/8ba19f04-7cc2-44da-bb2d-cd4200fce325/volumes" Mar 21 04:56:05 crc kubenswrapper[4580]: I0321 04:56:05.669453 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f808bccc-3450-447c-8b0a-7909fe189edd" path="/var/lib/kubelet/pods/f808bccc-3450-447c-8b0a-7909fe189edd/volumes" Mar 21 04:56:05 crc kubenswrapper[4580]: I0321 04:56:05.911894 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" event={"ID":"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5","Type":"ContainerStarted","Data":"b0a4b4b4556c39d3976b59720fa359dfb372dabfaa8788b45b7d415190bb0df7"} Mar 21 04:56:05 crc kubenswrapper[4580]: I0321 04:56:05.911959 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" event={"ID":"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5","Type":"ContainerStarted","Data":"d004d4cc1e8b44eca01cbc47e6e6a0c8da496e3006324fbb5b9d5a3593a2ce19"} Mar 21 04:56:05 crc kubenswrapper[4580]: I0321 04:56:05.912561 4580 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:05 crc kubenswrapper[4580]: I0321 04:56:05.947198 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" podStartSLOduration=3.947173003 podStartE2EDuration="3.947173003s" podCreationTimestamp="2026-03-21 04:56:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:56:05.945016899 +0000 UTC m=+271.027600547" watchObservedRunningTime="2026-03-21 04:56:05.947173003 +0000 UTC m=+271.029756631" Mar 21 04:56:06 crc kubenswrapper[4580]: I0321 04:56:06.046763 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:06 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:06 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:06 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:06 crc kubenswrapper[4580]: I0321 04:56:06.047569 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:06 crc kubenswrapper[4580]: I0321 04:56:06.094792 4580 ???:1] "http: TLS handshake error from 192.168.126.11:60434: no serving certificate available for the kubelet" Mar 21 04:56:06 crc kubenswrapper[4580]: I0321 04:56:06.178366 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 
04:56:07.039041 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d8c6c5666-76nkv"] Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.044256 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.049377 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:07 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:07 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:07 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.053212 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.054003 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.059360 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.058279 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.058363 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.058524 4580 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.062595 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.066389 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d8c6c5666-76nkv"] Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.067669 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.134231 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e8c7d5d-ac59-4a65-bd18-afb5723adade-serving-cert\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.134294 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-client-ca\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.134368 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-proxy-ca-bundles\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc 
kubenswrapper[4580]: I0321 04:56:07.134390 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjfdx\" (UniqueName: \"kubernetes.io/projected/4e8c7d5d-ac59-4a65-bd18-afb5723adade-kube-api-access-kjfdx\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.134414 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-config\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.235529 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjfdx\" (UniqueName: \"kubernetes.io/projected/4e8c7d5d-ac59-4a65-bd18-afb5723adade-kube-api-access-kjfdx\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.235586 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-config\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.235615 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e8c7d5d-ac59-4a65-bd18-afb5723adade-serving-cert\") pod \"controller-manager-6d8c6c5666-76nkv\" 
(UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.235643 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-client-ca\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.235708 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-proxy-ca-bundles\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.236865 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-proxy-ca-bundles\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.238008 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-client-ca\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.238120 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-config\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.253936 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjfdx\" (UniqueName: \"kubernetes.io/projected/4e8c7d5d-ac59-4a65-bd18-afb5723adade-kube-api-access-kjfdx\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.262203 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e8c7d5d-ac59-4a65-bd18-afb5723adade-serving-cert\") pod \"controller-manager-6d8c6c5666-76nkv\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.374538 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:07 crc kubenswrapper[4580]: I0321 04:56:07.974215 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d8c6c5666-76nkv"] Mar 21 04:56:08 crc kubenswrapper[4580]: W0321 04:56:08.010641 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e8c7d5d_ac59_4a65_bd18_afb5723adade.slice/crio-6f10f73805a95b282d9e8b461773fa0a5b0926d9a7f262abf7ae7f5a3a017c11 WatchSource:0}: Error finding container 6f10f73805a95b282d9e8b461773fa0a5b0926d9a7f262abf7ae7f5a3a017c11: Status 404 returned error can't find the container with id 6f10f73805a95b282d9e8b461773fa0a5b0926d9a7f262abf7ae7f5a3a017c11 Mar 21 04:56:08 crc kubenswrapper[4580]: I0321 04:56:08.047971 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:08 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:08 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:08 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:08 crc kubenswrapper[4580]: I0321 04:56:08.048063 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:08 crc kubenswrapper[4580]: I0321 04:56:08.151528 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" event={"ID":"4e8c7d5d-ac59-4a65-bd18-afb5723adade","Type":"ContainerStarted","Data":"6f10f73805a95b282d9e8b461773fa0a5b0926d9a7f262abf7ae7f5a3a017c11"} Mar 
21 04:56:09 crc kubenswrapper[4580]: I0321 04:56:09.051247 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:09 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:09 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:09 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:09 crc kubenswrapper[4580]: I0321 04:56:09.051691 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:09 crc kubenswrapper[4580]: I0321 04:56:09.202023 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" event={"ID":"4e8c7d5d-ac59-4a65-bd18-afb5723adade","Type":"ContainerStarted","Data":"95932d7a952251e0067b0e1d3b15d862cee957235add25ab9fe7b991301d046a"} Mar 21 04:56:09 crc kubenswrapper[4580]: I0321 04:56:09.203218 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:09 crc kubenswrapper[4580]: I0321 04:56:09.228004 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:09 crc kubenswrapper[4580]: I0321 04:56:09.270388 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" podStartSLOduration=7.270367466 podStartE2EDuration="7.270367466s" podCreationTimestamp="2026-03-21 04:56:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-21 04:56:09.241178869 +0000 UTC m=+274.323762507" watchObservedRunningTime="2026-03-21 04:56:09.270367466 +0000 UTC m=+274.352951084" Mar 21 04:56:10 crc kubenswrapper[4580]: I0321 04:56:10.048138 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:10 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:10 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:10 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:10 crc kubenswrapper[4580]: I0321 04:56:10.048831 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:10 crc kubenswrapper[4580]: I0321 04:56:10.160762 4580 patch_prober.go:28] interesting pod/console-f9d7485db-48dqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 21 04:56:10 crc kubenswrapper[4580]: I0321 04:56:10.160899 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-48dqz" podUID="bcade120-6711-4045-9149-08985699febd" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 21 04:56:10 crc kubenswrapper[4580]: I0321 04:56:10.273047 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection 
refused" start-of-body= Mar 21 04:56:10 crc kubenswrapper[4580]: I0321 04:56:10.273122 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:56:10 crc kubenswrapper[4580]: I0321 04:56:10.273201 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 21 04:56:10 crc kubenswrapper[4580]: I0321 04:56:10.273277 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:56:10 crc kubenswrapper[4580]: I0321 04:56:10.273334 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-4ffj8" Mar 21 04:56:10 crc kubenswrapper[4580]: I0321 04:56:10.274424 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"60bf223bf0fffddead1d8dc64a91349c534f1f50c12865c63206c9a247adefbc"} pod="openshift-console/downloads-7954f5f757-4ffj8" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 21 04:56:10 crc kubenswrapper[4580]: I0321 04:56:10.274468 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" 
containerID="cri-o://60bf223bf0fffddead1d8dc64a91349c534f1f50c12865c63206c9a247adefbc" gracePeriod=2 Mar 21 04:56:10 crc kubenswrapper[4580]: I0321 04:56:10.274731 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 21 04:56:10 crc kubenswrapper[4580]: I0321 04:56:10.274767 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:56:11 crc kubenswrapper[4580]: I0321 04:56:11.047102 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:11 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:11 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:11 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:11 crc kubenswrapper[4580]: I0321 04:56:11.047510 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:11 crc kubenswrapper[4580]: I0321 04:56:11.259427 4580 generic.go:334] "Generic (PLEG): container finished" podID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerID="60bf223bf0fffddead1d8dc64a91349c534f1f50c12865c63206c9a247adefbc" exitCode=0 Mar 21 04:56:11 crc kubenswrapper[4580]: I0321 04:56:11.259665 4580 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-console/downloads-7954f5f757-4ffj8" event={"ID":"69b1f163-8594-47b1-85c7-3330e0d50d8f","Type":"ContainerDied","Data":"60bf223bf0fffddead1d8dc64a91349c534f1f50c12865c63206c9a247adefbc"} Mar 21 04:56:12 crc kubenswrapper[4580]: I0321 04:56:12.047208 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:12 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:12 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:12 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:12 crc kubenswrapper[4580]: I0321 04:56:12.047287 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:13 crc kubenswrapper[4580]: I0321 04:56:13.046380 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:13 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:13 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:13 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:13 crc kubenswrapper[4580]: I0321 04:56:13.046488 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:14 crc kubenswrapper[4580]: I0321 04:56:14.047111 4580 patch_prober.go:28] interesting 
pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:14 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:14 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:14 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:14 crc kubenswrapper[4580]: I0321 04:56:14.047168 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:15 crc kubenswrapper[4580]: I0321 04:56:15.047330 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:15 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:15 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:15 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:15 crc kubenswrapper[4580]: I0321 04:56:15.047429 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:15 crc kubenswrapper[4580]: I0321 04:56:15.948648 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:56:15 crc kubenswrapper[4580]: I0321 04:56:15.948750 4580 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:56:16 crc kubenswrapper[4580]: I0321 04:56:16.048760 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:16 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:16 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:16 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:16 crc kubenswrapper[4580]: I0321 04:56:16.048865 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:16 crc kubenswrapper[4580]: I0321 04:56:16.862504 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:56:17 crc kubenswrapper[4580]: I0321 04:56:17.046754 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:17 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:17 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:17 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:17 crc kubenswrapper[4580]: I0321 04:56:17.046858 4580 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:18 crc kubenswrapper[4580]: I0321 04:56:18.047697 4580 patch_prober.go:28] interesting pod/router-default-5444994796-dnnd6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:56:18 crc kubenswrapper[4580]: [-]has-synced failed: reason withheld Mar 21 04:56:18 crc kubenswrapper[4580]: [+]process-running ok Mar 21 04:56:18 crc kubenswrapper[4580]: healthz check failed Mar 21 04:56:18 crc kubenswrapper[4580]: I0321 04:56:18.047794 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dnnd6" podUID="41dae12a-fc3b-4e2b-a64a-f4f4c791afbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:56:19 crc kubenswrapper[4580]: I0321 04:56:19.046801 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:56:19 crc kubenswrapper[4580]: I0321 04:56:19.050840 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-dnnd6" Mar 21 04:56:20 crc kubenswrapper[4580]: I0321 04:56:20.160622 4580 patch_prober.go:28] interesting pod/console-f9d7485db-48dqz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 21 04:56:20 crc kubenswrapper[4580]: I0321 04:56:20.160695 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-48dqz" podUID="bcade120-6711-4045-9149-08985699febd" containerName="console" probeResult="failure" 
output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 21 04:56:20 crc kubenswrapper[4580]: I0321 04:56:20.271880 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 21 04:56:20 crc kubenswrapper[4580]: I0321 04:56:20.271970 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:56:21 crc kubenswrapper[4580]: I0321 04:56:21.059323 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kzsgp" Mar 21 04:56:22 crc kubenswrapper[4580]: I0321 04:56:22.569671 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d8c6c5666-76nkv"] Mar 21 04:56:22 crc kubenswrapper[4580]: I0321 04:56:22.570029 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" podUID="4e8c7d5d-ac59-4a65-bd18-afb5723adade" containerName="controller-manager" containerID="cri-o://95932d7a952251e0067b0e1d3b15d862cee957235add25ab9fe7b991301d046a" gracePeriod=30 Mar 21 04:56:22 crc kubenswrapper[4580]: I0321 04:56:22.663364 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495544757-v6422"] Mar 21 04:56:22 crc kubenswrapper[4580]: I0321 04:56:22.664143 4580 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" podUID="bcf9a8dc-2589-46ca-8063-4d981d5e8ca5" containerName="route-controller-manager" containerID="cri-o://b0a4b4b4556c39d3976b59720fa359dfb372dabfaa8788b45b7d415190bb0df7" gracePeriod=30 Mar 21 04:56:24 crc kubenswrapper[4580]: I0321 04:56:24.383302 4580 generic.go:334] "Generic (PLEG): container finished" podID="4e8c7d5d-ac59-4a65-bd18-afb5723adade" containerID="95932d7a952251e0067b0e1d3b15d862cee957235add25ab9fe7b991301d046a" exitCode=0 Mar 21 04:56:24 crc kubenswrapper[4580]: I0321 04:56:24.383379 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" event={"ID":"4e8c7d5d-ac59-4a65-bd18-afb5723adade","Type":"ContainerDied","Data":"95932d7a952251e0067b0e1d3b15d862cee957235add25ab9fe7b991301d046a"} Mar 21 04:56:24 crc kubenswrapper[4580]: I0321 04:56:24.431628 4580 patch_prober.go:28] interesting pod/route-controller-manager-6495544757-v6422 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Mar 21 04:56:24 crc kubenswrapper[4580]: I0321 04:56:24.431742 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" podUID="bcf9a8dc-2589-46ca-8063-4d981d5e8ca5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Mar 21 04:56:24 crc kubenswrapper[4580]: I0321 04:56:24.820517 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 21 04:56:24 crc kubenswrapper[4580]: I0321 04:56:24.821613 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:56:24 crc kubenswrapper[4580]: I0321 04:56:24.824197 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 21 04:56:24 crc kubenswrapper[4580]: I0321 04:56:24.824859 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 21 04:56:24 crc kubenswrapper[4580]: I0321 04:56:24.831073 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 21 04:56:24 crc kubenswrapper[4580]: I0321 04:56:24.978724 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/317591a6-758a-45a3-b1a7-ca32fb8f6f34-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"317591a6-758a-45a3-b1a7-ca32fb8f6f34\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:56:24 crc kubenswrapper[4580]: I0321 04:56:24.978868 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/317591a6-758a-45a3-b1a7-ca32fb8f6f34-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"317591a6-758a-45a3-b1a7-ca32fb8f6f34\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:56:25 crc kubenswrapper[4580]: I0321 04:56:25.080428 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/317591a6-758a-45a3-b1a7-ca32fb8f6f34-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"317591a6-758a-45a3-b1a7-ca32fb8f6f34\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:56:25 crc kubenswrapper[4580]: I0321 04:56:25.081495 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/317591a6-758a-45a3-b1a7-ca32fb8f6f34-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"317591a6-758a-45a3-b1a7-ca32fb8f6f34\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:56:25 crc kubenswrapper[4580]: I0321 04:56:25.081698 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/317591a6-758a-45a3-b1a7-ca32fb8f6f34-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"317591a6-758a-45a3-b1a7-ca32fb8f6f34\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:56:25 crc kubenswrapper[4580]: I0321 04:56:25.132464 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/317591a6-758a-45a3-b1a7-ca32fb8f6f34-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"317591a6-758a-45a3-b1a7-ca32fb8f6f34\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:56:25 crc kubenswrapper[4580]: I0321 04:56:25.179468 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:56:25 crc kubenswrapper[4580]: I0321 04:56:25.391823 4580 generic.go:334] "Generic (PLEG): container finished" podID="bcf9a8dc-2589-46ca-8063-4d981d5e8ca5" containerID="b0a4b4b4556c39d3976b59720fa359dfb372dabfaa8788b45b7d415190bb0df7" exitCode=0 Mar 21 04:56:25 crc kubenswrapper[4580]: I0321 04:56:25.391933 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" event={"ID":"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5","Type":"ContainerDied","Data":"b0a4b4b4556c39d3976b59720fa359dfb372dabfaa8788b45b7d415190bb0df7"} Mar 21 04:56:26 crc kubenswrapper[4580]: I0321 04:56:26.606978 4580 ???:1] "http: TLS handshake error from 192.168.126.11:55844: no serving certificate available for the kubelet" Mar 21 04:56:28 crc kubenswrapper[4580]: I0321 04:56:28.376292 4580 patch_prober.go:28] interesting pod/controller-manager-6d8c6c5666-76nkv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:56:28 crc kubenswrapper[4580]: I0321 04:56:28.376866 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" podUID="4e8c7d5d-ac59-4a65-bd18-afb5723adade" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:56:29 crc kubenswrapper[4580]: I0321 04:56:29.006032 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 21 04:56:29 crc kubenswrapper[4580]: I0321 04:56:29.007423 4580 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:56:29 crc kubenswrapper[4580]: I0321 04:56:29.018969 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 21 04:56:29 crc kubenswrapper[4580]: I0321 04:56:29.145214 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-kube-api-access\") pod \"installer-9-crc\" (UID: \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:56:29 crc kubenswrapper[4580]: I0321 04:56:29.145299 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:56:29 crc kubenswrapper[4580]: I0321 04:56:29.145318 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-var-lock\") pod \"installer-9-crc\" (UID: \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:56:29 crc kubenswrapper[4580]: I0321 04:56:29.246167 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-kube-api-access\") pod \"installer-9-crc\" (UID: \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:56:29 crc kubenswrapper[4580]: I0321 04:56:29.246230 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:56:29 crc kubenswrapper[4580]: I0321 04:56:29.246258 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-var-lock\") pod \"installer-9-crc\" (UID: \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:56:29 crc kubenswrapper[4580]: I0321 04:56:29.246335 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:56:29 crc kubenswrapper[4580]: I0321 04:56:29.246398 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-var-lock\") pod \"installer-9-crc\" (UID: \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:56:29 crc kubenswrapper[4580]: I0321 04:56:29.269893 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-kube-api-access\") pod \"installer-9-crc\" (UID: \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:56:29 crc kubenswrapper[4580]: I0321 04:56:29.346593 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:56:30 crc kubenswrapper[4580]: I0321 04:56:30.167086 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:56:30 crc kubenswrapper[4580]: I0321 04:56:30.172314 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-48dqz" Mar 21 04:56:30 crc kubenswrapper[4580]: I0321 04:56:30.271072 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 21 04:56:30 crc kubenswrapper[4580]: I0321 04:56:30.271145 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.339430 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.373462 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h"] Mar 21 04:56:33 crc kubenswrapper[4580]: E0321 04:56:33.373731 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e8c7d5d-ac59-4a65-bd18-afb5723adade" containerName="controller-manager" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.373749 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e8c7d5d-ac59-4a65-bd18-afb5723adade" containerName="controller-manager" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.374023 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e8c7d5d-ac59-4a65-bd18-afb5723adade" containerName="controller-manager" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.374701 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.392621 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h"] Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.415691 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-proxy-ca-bundles\") pod \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.415770 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e8c7d5d-ac59-4a65-bd18-afb5723adade-serving-cert\") pod \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.415836 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjfdx\" (UniqueName: \"kubernetes.io/projected/4e8c7d5d-ac59-4a65-bd18-afb5723adade-kube-api-access-kjfdx\") pod \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.415973 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-config\") pod \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\" (UID: \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.416036 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-client-ca\") pod \"4e8c7d5d-ac59-4a65-bd18-afb5723adade\" (UID: 
\"4e8c7d5d-ac59-4a65-bd18-afb5723adade\") " Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.417089 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-client-ca" (OuterVolumeSpecName: "client-ca") pod "4e8c7d5d-ac59-4a65-bd18-afb5723adade" (UID: "4e8c7d5d-ac59-4a65-bd18-afb5723adade"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.417146 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-config" (OuterVolumeSpecName: "config") pod "4e8c7d5d-ac59-4a65-bd18-afb5723adade" (UID: "4e8c7d5d-ac59-4a65-bd18-afb5723adade"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.417254 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4e8c7d5d-ac59-4a65-bd18-afb5723adade" (UID: "4e8c7d5d-ac59-4a65-bd18-afb5723adade"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.417610 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.417634 4580 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.417652 4580 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e8c7d5d-ac59-4a65-bd18-afb5723adade-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.426625 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e8c7d5d-ac59-4a65-bd18-afb5723adade-kube-api-access-kjfdx" (OuterVolumeSpecName: "kube-api-access-kjfdx") pod "4e8c7d5d-ac59-4a65-bd18-afb5723adade" (UID: "4e8c7d5d-ac59-4a65-bd18-afb5723adade"). InnerVolumeSpecName "kube-api-access-kjfdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.429870 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e8c7d5d-ac59-4a65-bd18-afb5723adade-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4e8c7d5d-ac59-4a65-bd18-afb5723adade" (UID: "4e8c7d5d-ac59-4a65-bd18-afb5723adade"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.459272 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" event={"ID":"4e8c7d5d-ac59-4a65-bd18-afb5723adade","Type":"ContainerDied","Data":"6f10f73805a95b282d9e8b461773fa0a5b0926d9a7f262abf7ae7f5a3a017c11"} Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.459327 4580 scope.go:117] "RemoveContainer" containerID="95932d7a952251e0067b0e1d3b15d862cee957235add25ab9fe7b991301d046a" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.459436 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d8c6c5666-76nkv" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.494530 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d8c6c5666-76nkv"] Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.494903 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6d8c6c5666-76nkv"] Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.519056 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx65g\" (UniqueName: \"kubernetes.io/projected/6fe6e371-faf7-4749-bcab-f7c196f5d080-kube-api-access-dx65g\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.519111 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-client-ca\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " 
pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.519268 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-proxy-ca-bundles\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.519299 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fe6e371-faf7-4749-bcab-f7c196f5d080-serving-cert\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.519331 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-config\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.519374 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e8c7d5d-ac59-4a65-bd18-afb5723adade-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.519387 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjfdx\" (UniqueName: \"kubernetes.io/projected/4e8c7d5d-ac59-4a65-bd18-afb5723adade-kube-api-access-kjfdx\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.620228 4580 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-proxy-ca-bundles\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.620276 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fe6e371-faf7-4749-bcab-f7c196f5d080-serving-cert\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.620304 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-config\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.620333 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx65g\" (UniqueName: \"kubernetes.io/projected/6fe6e371-faf7-4749-bcab-f7c196f5d080-kube-api-access-dx65g\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.620349 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-client-ca\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " 
pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.621336 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-client-ca\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.622711 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-proxy-ca-bundles\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.624463 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fe6e371-faf7-4749-bcab-f7c196f5d080-serving-cert\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.625417 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-config\") pod \"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.643472 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx65g\" (UniqueName: \"kubernetes.io/projected/6fe6e371-faf7-4749-bcab-f7c196f5d080-kube-api-access-dx65g\") pod 
\"controller-manager-77d6c4bb9c-l5w4h\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.644277 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e8c7d5d-ac59-4a65-bd18-afb5723adade" path="/var/lib/kubelet/pods/4e8c7d5d-ac59-4a65-bd18-afb5723adade/volumes" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.721743 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.848161 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.875293 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.923820 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-client-ca\") pod \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.923896 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgf4r\" (UniqueName: \"kubernetes.io/projected/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-kube-api-access-vgf4r\") pod \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.923952 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-serving-cert\") pod \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.924078 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-config\") pod \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\" (UID: \"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5\") " Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.925529 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-client-ca" (OuterVolumeSpecName: "client-ca") pod "bcf9a8dc-2589-46ca-8063-4d981d5e8ca5" (UID: "bcf9a8dc-2589-46ca-8063-4d981d5e8ca5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.925696 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-config" (OuterVolumeSpecName: "config") pod "bcf9a8dc-2589-46ca-8063-4d981d5e8ca5" (UID: "bcf9a8dc-2589-46ca-8063-4d981d5e8ca5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.931265 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-kube-api-access-vgf4r" (OuterVolumeSpecName: "kube-api-access-vgf4r") pod "bcf9a8dc-2589-46ca-8063-4d981d5e8ca5" (UID: "bcf9a8dc-2589-46ca-8063-4d981d5e8ca5"). InnerVolumeSpecName "kube-api-access-vgf4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:33 crc kubenswrapper[4580]: I0321 04:56:33.931824 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bcf9a8dc-2589-46ca-8063-4d981d5e8ca5" (UID: "bcf9a8dc-2589-46ca-8063-4d981d5e8ca5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:34 crc kubenswrapper[4580]: I0321 04:56:34.026499 4580 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:34 crc kubenswrapper[4580]: I0321 04:56:34.026543 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgf4r\" (UniqueName: \"kubernetes.io/projected/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-kube-api-access-vgf4r\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:34 crc kubenswrapper[4580]: I0321 04:56:34.026556 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:34 crc kubenswrapper[4580]: I0321 04:56:34.026568 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:34 crc kubenswrapper[4580]: I0321 04:56:34.468346 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" Mar 21 04:56:34 crc kubenswrapper[4580]: I0321 04:56:34.468343 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6495544757-v6422" event={"ID":"bcf9a8dc-2589-46ca-8063-4d981d5e8ca5","Type":"ContainerDied","Data":"d004d4cc1e8b44eca01cbc47e6e6a0c8da496e3006324fbb5b9d5a3593a2ce19"} Mar 21 04:56:34 crc kubenswrapper[4580]: I0321 04:56:34.501576 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495544757-v6422"] Mar 21 04:56:34 crc kubenswrapper[4580]: I0321 04:56:34.508547 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6495544757-v6422"] Mar 21 04:56:35 crc kubenswrapper[4580]: I0321 04:56:35.625846 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf9a8dc-2589-46ca-8063-4d981d5e8ca5" path="/var/lib/kubelet/pods/bcf9a8dc-2589-46ca-8063-4d981d5e8ca5/volumes" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.057419 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq"] Mar 21 04:56:36 crc kubenswrapper[4580]: E0321 04:56:36.057844 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf9a8dc-2589-46ca-8063-4d981d5e8ca5" containerName="route-controller-manager" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.057862 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf9a8dc-2589-46ca-8063-4d981d5e8ca5" containerName="route-controller-manager" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.057997 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf9a8dc-2589-46ca-8063-4d981d5e8ca5" containerName="route-controller-manager" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 
04:56:36.058716 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.067804 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.068352 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.068614 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.068829 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.068984 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.069124 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.076718 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq"] Mar 21 04:56:36 crc kubenswrapper[4580]: E0321 04:56:36.084036 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 21 04:56:36 crc kubenswrapper[4580]: E0321 04:56:36.084210 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gpp8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4kpsb_openshift-marketplace(484933df-fe17-42ec-99da-d1187d674051): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:56:36 crc kubenswrapper[4580]: E0321 04:56:36.085499 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4kpsb" podUID="484933df-fe17-42ec-99da-d1187d674051" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.157072 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9935eb9-b708-4943-a85a-50fe7cbd6777-client-ca\") pod \"route-controller-manager-5b7c7c868c-7dmjq\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.157236 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9935eb9-b708-4943-a85a-50fe7cbd6777-config\") pod \"route-controller-manager-5b7c7c868c-7dmjq\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.157264 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9935eb9-b708-4943-a85a-50fe7cbd6777-serving-cert\") pod \"route-controller-manager-5b7c7c868c-7dmjq\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.157287 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2pdn\" (UniqueName: \"kubernetes.io/projected/b9935eb9-b708-4943-a85a-50fe7cbd6777-kube-api-access-j2pdn\") pod \"route-controller-manager-5b7c7c868c-7dmjq\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " 
pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.259004 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9935eb9-b708-4943-a85a-50fe7cbd6777-client-ca\") pod \"route-controller-manager-5b7c7c868c-7dmjq\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.259091 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9935eb9-b708-4943-a85a-50fe7cbd6777-config\") pod \"route-controller-manager-5b7c7c868c-7dmjq\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.259130 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9935eb9-b708-4943-a85a-50fe7cbd6777-serving-cert\") pod \"route-controller-manager-5b7c7c868c-7dmjq\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.259163 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2pdn\" (UniqueName: \"kubernetes.io/projected/b9935eb9-b708-4943-a85a-50fe7cbd6777-kube-api-access-j2pdn\") pod \"route-controller-manager-5b7c7c868c-7dmjq\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.260244 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b9935eb9-b708-4943-a85a-50fe7cbd6777-client-ca\") pod \"route-controller-manager-5b7c7c868c-7dmjq\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.260425 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9935eb9-b708-4943-a85a-50fe7cbd6777-config\") pod \"route-controller-manager-5b7c7c868c-7dmjq\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.267984 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9935eb9-b708-4943-a85a-50fe7cbd6777-serving-cert\") pod \"route-controller-manager-5b7c7c868c-7dmjq\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.278859 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2pdn\" (UniqueName: \"kubernetes.io/projected/b9935eb9-b708-4943-a85a-50fe7cbd6777-kube-api-access-j2pdn\") pod \"route-controller-manager-5b7c7c868c-7dmjq\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.399410 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:36 crc kubenswrapper[4580]: I0321 04:56:36.771736 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kmfk5"] Mar 21 04:56:38 crc kubenswrapper[4580]: E0321 04:56:38.351232 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4kpsb" podUID="484933df-fe17-42ec-99da-d1187d674051" Mar 21 04:56:38 crc kubenswrapper[4580]: E0321 04:56:38.427286 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 21 04:56:38 crc kubenswrapper[4580]: E0321 04:56:38.427494 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6tbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-6ppjj_openshift-marketplace(c002830b-7ac1-4912-9b31-bad37ac63104): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:56:38 crc kubenswrapper[4580]: E0321 04:56:38.433985 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-6ppjj" podUID="c002830b-7ac1-4912-9b31-bad37ac63104" Mar 21 04:56:40 crc 
kubenswrapper[4580]: I0321 04:56:40.273078 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 21 04:56:40 crc kubenswrapper[4580]: I0321 04:56:40.273150 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:56:44 crc kubenswrapper[4580]: E0321 04:56:44.716171 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-6ppjj" podUID="c002830b-7ac1-4912-9b31-bad37ac63104" Mar 21 04:56:44 crc kubenswrapper[4580]: E0321 04:56:44.836126 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 21 04:56:44 crc kubenswrapper[4580]: E0321 04:56:44.836599 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kmlgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ns8gg_openshift-marketplace(bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:56:44 crc kubenswrapper[4580]: E0321 04:56:44.837914 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ns8gg" podUID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" Mar 21 04:56:44 crc 
kubenswrapper[4580]: E0321 04:56:44.985949 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 21 04:56:44 crc kubenswrapper[4580]: E0321 04:56:44.986572 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l2dsq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-77nmx_openshift-marketplace(37b3e873-7ca5-4413-9998-6aaf824d6cd7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:56:44 crc kubenswrapper[4580]: E0321 04:56:44.989934 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-77nmx" podUID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" Mar 21 04:56:45 crc kubenswrapper[4580]: I0321 04:56:45.948206 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:56:45 crc kubenswrapper[4580]: I0321 04:56:45.948294 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:56:48 crc kubenswrapper[4580]: E0321 04:56:48.474036 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-ns8gg" podUID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" Mar 21 04:56:48 crc kubenswrapper[4580]: E0321 04:56:48.474649 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling 
image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-77nmx" podUID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" Mar 21 04:56:49 crc kubenswrapper[4580]: E0321 04:56:49.499588 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 21 04:56:49 crc kubenswrapper[4580]: E0321 04:56:49.500212 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pbthv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessa
gePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-n99sq_openshift-marketplace(82874992-faa8-4c73-955b-ffe5f02726a7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:56:49 crc kubenswrapper[4580]: E0321 04:56:49.501439 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-n99sq" podUID="82874992-faa8-4c73-955b-ffe5f02726a7" Mar 21 04:56:50 crc kubenswrapper[4580]: I0321 04:56:50.271991 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 21 04:56:50 crc kubenswrapper[4580]: I0321 04:56:50.272077 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:56:50 crc kubenswrapper[4580]: E0321 04:56:50.619712 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 21 04:56:50 crc kubenswrapper[4580]: E0321 04:56:50.619907 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tlp7m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jq5t4_openshift-marketplace(1dd3cd12-741f-4993-8b39-994545e15c2c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:56:50 crc kubenswrapper[4580]: E0321 04:56:50.621838 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jq5t4" podUID="1dd3cd12-741f-4993-8b39-994545e15c2c" Mar 21 04:56:51 crc kubenswrapper[4580]: E0321 04:56:51.192623 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-n99sq" podUID="82874992-faa8-4c73-955b-ffe5f02726a7" Mar 21 04:56:51 crc kubenswrapper[4580]: I0321 04:56:51.223144 4580 scope.go:117] "RemoveContainer" containerID="b0a4b4b4556c39d3976b59720fa359dfb372dabfaa8788b45b7d415190bb0df7" Mar 21 04:56:51 crc kubenswrapper[4580]: E0321 04:56:51.612158 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jq5t4" podUID="1dd3cd12-741f-4993-8b39-994545e15c2c" Mar 21 04:56:51 crc kubenswrapper[4580]: I0321 04:56:51.690678 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 21 04:56:51 crc kubenswrapper[4580]: I0321 04:56:51.965802 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 21 04:56:51 crc kubenswrapper[4580]: W0321 04:56:51.968285 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda5a57185_3ae7_49c8_bc2f_5a57f2be7429.slice/crio-57f6a5aff1e58ecf4c7792ff128663c538f41f7a4a175242168a06b4bfcc06e0 WatchSource:0}: Error finding container 57f6a5aff1e58ecf4c7792ff128663c538f41f7a4a175242168a06b4bfcc06e0: Status 404 returned error can't find the container with id 
57f6a5aff1e58ecf4c7792ff128663c538f41f7a4a175242168a06b4bfcc06e0 Mar 21 04:56:51 crc kubenswrapper[4580]: I0321 04:56:51.971831 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq"] Mar 21 04:56:51 crc kubenswrapper[4580]: W0321 04:56:51.980793 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9935eb9_b708_4943_a85a_50fe7cbd6777.slice/crio-3a4d8ec9c1c850347a155ab721050a816e7ed7c8d294f46a0cc24edc1ab26285 WatchSource:0}: Error finding container 3a4d8ec9c1c850347a155ab721050a816e7ed7c8d294f46a0cc24edc1ab26285: Status 404 returned error can't find the container with id 3a4d8ec9c1c850347a155ab721050a816e7ed7c8d294f46a0cc24edc1ab26285 Mar 21 04:56:51 crc kubenswrapper[4580]: I0321 04:56:51.995675 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h"] Mar 21 04:56:52 crc kubenswrapper[4580]: E0321 04:56:52.395981 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 21 04:56:52 crc kubenswrapper[4580]: E0321 04:56:52.396667 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9dgk5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-47qgx_openshift-marketplace(bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:56:52 crc kubenswrapper[4580]: E0321 04:56:52.399052 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-47qgx" podUID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" Mar 21 04:56:52 crc 
kubenswrapper[4580]: I0321 04:56:52.638601 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a5a57185-3ae7-49c8-bc2f-5a57f2be7429","Type":"ContainerStarted","Data":"57f6a5aff1e58ecf4c7792ff128663c538f41f7a4a175242168a06b4bfcc06e0"} Mar 21 04:56:52 crc kubenswrapper[4580]: I0321 04:56:52.641165 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" event={"ID":"6fe6e371-faf7-4749-bcab-f7c196f5d080","Type":"ContainerStarted","Data":"eca1018e908d95a8aa144b56c3c3d170064ca50e3809620f1546cdb3ef67ef1d"} Mar 21 04:56:52 crc kubenswrapper[4580]: I0321 04:56:52.643856 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" event={"ID":"b9935eb9-b708-4943-a85a-50fe7cbd6777","Type":"ContainerStarted","Data":"3a4d8ec9c1c850347a155ab721050a816e7ed7c8d294f46a0cc24edc1ab26285"} Mar 21 04:56:52 crc kubenswrapper[4580]: I0321 04:56:52.648164 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"317591a6-758a-45a3-b1a7-ca32fb8f6f34","Type":"ContainerStarted","Data":"a7ddb9e4b2e44cf712dbaf75239c7e2fb86c3120d02d0150f61e3fc6a8ad234b"} Mar 21 04:56:52 crc kubenswrapper[4580]: E0321 04:56:52.651274 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-47qgx" podUID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" Mar 21 04:56:53 crc kubenswrapper[4580]: E0321 04:56:53.496080 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 21 
04:56:53 crc kubenswrapper[4580]: E0321 04:56:53.496824 4580 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:56:53 crc kubenswrapper[4580]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 21 04:56:53 crc kubenswrapper[4580]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jpbx7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29567816-m2qj9_openshift-infra(0cb67fb0-cbe3-47cd-9029-f54e6e74729d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 21 04:56:53 crc kubenswrapper[4580]: > logger="UnhandledError" Mar 21 04:56:53 crc kubenswrapper[4580]: E0321 04:56:53.498131 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29567816-m2qj9" podUID="0cb67fb0-cbe3-47cd-9029-f54e6e74729d" Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.655713 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" event={"ID":"6fe6e371-faf7-4749-bcab-f7c196f5d080","Type":"ContainerStarted","Data":"2fed3f0eb2a89d15bc3b276cad4e08c741a12c57e0f629a5f8e92f6b77199f01"} Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.657104 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.657727 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"317591a6-758a-45a3-b1a7-ca32fb8f6f34","Type":"ContainerStarted","Data":"e7307ecfeb7e89eb107ca8fb0f9d72c0e2ec1a51fb1784baa382274ebeaa81ec"} Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.659394 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4ffj8" event={"ID":"69b1f163-8594-47b1-85c7-3330e0d50d8f","Type":"ContainerStarted","Data":"2491afd08af2712d7bbb02b11400e10805966dda62d14b16423a0eeb8e5063d1"} Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.659548 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-4ffj8" Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.660202 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.660273 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 
04:56:53.661809 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a5a57185-3ae7-49c8-bc2f-5a57f2be7429","Type":"ContainerStarted","Data":"e6e08e5148277152c14110df862955833c2821828d38f35e5ee8f18c0b32279e"} Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.664718 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" event={"ID":"b9935eb9-b708-4943-a85a-50fe7cbd6777","Type":"ContainerStarted","Data":"685894d22482b0b37f9c59de46b6829b0b7df9fcb555b29653219f4a673e05c1"} Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.665258 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:53 crc kubenswrapper[4580]: E0321 04:56:53.667287 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29567816-m2qj9" podUID="0cb67fb0-cbe3-47cd-9029-f54e6e74729d" Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.678350 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.697653 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" podStartSLOduration=31.697615654 podStartE2EDuration="31.697615654s" podCreationTimestamp="2026-03-21 04:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:56:53.693684307 +0000 UTC m=+318.776267935" watchObservedRunningTime="2026-03-21 04:56:53.697615654 +0000 UTC 
m=+318.780199282" Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.736536 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=29.736490288 podStartE2EDuration="29.736490288s" podCreationTimestamp="2026-03-21 04:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:56:53.734903721 +0000 UTC m=+318.817487359" watchObservedRunningTime="2026-03-21 04:56:53.736490288 +0000 UTC m=+318.819073926" Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.858924 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" podStartSLOduration=31.858902143 podStartE2EDuration="31.858902143s" podCreationTimestamp="2026-03-21 04:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:56:53.857080699 +0000 UTC m=+318.939664327" watchObservedRunningTime="2026-03-21 04:56:53.858902143 +0000 UTC m=+318.941485771" Mar 21 04:56:53 crc kubenswrapper[4580]: I0321 04:56:53.961471 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:56:54 crc kubenswrapper[4580]: E0321 04:56:54.261014 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 21 04:56:54 crc kubenswrapper[4580]: E0321 04:56:54.261258 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vnx45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-kk5nv_openshift-marketplace(9940b0fa-e788-4da2-af4f-da4cdc60f12d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:56:54 crc kubenswrapper[4580]: E0321 04:56:54.262558 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-kk5nv" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" Mar 21 04:56:54 crc kubenswrapper[4580]: E0321 04:56:54.305536 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 21 04:56:54 crc kubenswrapper[4580]: E0321 04:56:54.306148 4580 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:56:54 crc kubenswrapper[4580]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 21 04:56:54 crc kubenswrapper[4580]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7njkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29567814-8cxbg_openshift-infra(1714688f-61d5-436b-baaf-2668757942fd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 21 04:56:54 crc kubenswrapper[4580]: > logger="UnhandledError" Mar 21 04:56:54 crc kubenswrapper[4580]: E0321 
04:56:54.307749 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29567814-8cxbg" podUID="1714688f-61d5-436b-baaf-2668757942fd" Mar 21 04:56:54 crc kubenswrapper[4580]: I0321 04:56:54.670574 4580 generic.go:334] "Generic (PLEG): container finished" podID="317591a6-758a-45a3-b1a7-ca32fb8f6f34" containerID="e7307ecfeb7e89eb107ca8fb0f9d72c0e2ec1a51fb1784baa382274ebeaa81ec" exitCode=0 Mar 21 04:56:54 crc kubenswrapper[4580]: I0321 04:56:54.670642 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"317591a6-758a-45a3-b1a7-ca32fb8f6f34","Type":"ContainerDied","Data":"e7307ecfeb7e89eb107ca8fb0f9d72c0e2ec1a51fb1784baa382274ebeaa81ec"} Mar 21 04:56:54 crc kubenswrapper[4580]: I0321 04:56:54.672742 4580 patch_prober.go:28] interesting pod/downloads-7954f5f757-4ffj8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 21 04:56:54 crc kubenswrapper[4580]: I0321 04:56:54.672819 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4ffj8" podUID="69b1f163-8594-47b1-85c7-3330e0d50d8f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 21 04:56:54 crc kubenswrapper[4580]: E0321 04:56:54.673515 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29567814-8cxbg" podUID="1714688f-61d5-436b-baaf-2668757942fd" Mar 
21 04:56:54 crc kubenswrapper[4580]: E0321 04:56:54.674539 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-kk5nv" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" Mar 21 04:56:54 crc kubenswrapper[4580]: I0321 04:56:54.747228 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=26.747197589 podStartE2EDuration="26.747197589s" podCreationTimestamp="2026-03-21 04:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:56:54.744636433 +0000 UTC m=+319.827220061" watchObservedRunningTime="2026-03-21 04:56:54.747197589 +0000 UTC m=+319.829781227" Mar 21 04:56:55 crc kubenswrapper[4580]: I0321 04:56:55.679519 4580 generic.go:334] "Generic (PLEG): container finished" podID="484933df-fe17-42ec-99da-d1187d674051" containerID="d56fcd27d3ef488fd44c2f25e326ca8c29f0c152404adb1855d11cafe0ae7519" exitCode=0 Mar 21 04:56:55 crc kubenswrapper[4580]: I0321 04:56:55.680209 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kpsb" event={"ID":"484933df-fe17-42ec-99da-d1187d674051","Type":"ContainerDied","Data":"d56fcd27d3ef488fd44c2f25e326ca8c29f0c152404adb1855d11cafe0ae7519"} Mar 21 04:56:56 crc kubenswrapper[4580]: I0321 04:56:56.069492 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:56:56 crc kubenswrapper[4580]: I0321 04:56:56.178736 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/317591a6-758a-45a3-b1a7-ca32fb8f6f34-kubelet-dir\") pod \"317591a6-758a-45a3-b1a7-ca32fb8f6f34\" (UID: \"317591a6-758a-45a3-b1a7-ca32fb8f6f34\") " Mar 21 04:56:56 crc kubenswrapper[4580]: I0321 04:56:56.178885 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/317591a6-758a-45a3-b1a7-ca32fb8f6f34-kube-api-access\") pod \"317591a6-758a-45a3-b1a7-ca32fb8f6f34\" (UID: \"317591a6-758a-45a3-b1a7-ca32fb8f6f34\") " Mar 21 04:56:56 crc kubenswrapper[4580]: I0321 04:56:56.178878 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/317591a6-758a-45a3-b1a7-ca32fb8f6f34-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "317591a6-758a-45a3-b1a7-ca32fb8f6f34" (UID: "317591a6-758a-45a3-b1a7-ca32fb8f6f34"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:56:56 crc kubenswrapper[4580]: I0321 04:56:56.179202 4580 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/317591a6-758a-45a3-b1a7-ca32fb8f6f34-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:56 crc kubenswrapper[4580]: I0321 04:56:56.189032 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/317591a6-758a-45a3-b1a7-ca32fb8f6f34-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "317591a6-758a-45a3-b1a7-ca32fb8f6f34" (UID: "317591a6-758a-45a3-b1a7-ca32fb8f6f34"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:56 crc kubenswrapper[4580]: I0321 04:56:56.280692 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/317591a6-758a-45a3-b1a7-ca32fb8f6f34-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:56 crc kubenswrapper[4580]: I0321 04:56:56.690330 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"317591a6-758a-45a3-b1a7-ca32fb8f6f34","Type":"ContainerDied","Data":"a7ddb9e4b2e44cf712dbaf75239c7e2fb86c3120d02d0150f61e3fc6a8ad234b"} Mar 21 04:56:56 crc kubenswrapper[4580]: I0321 04:56:56.690397 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7ddb9e4b2e44cf712dbaf75239c7e2fb86c3120d02d0150f61e3fc6a8ad234b" Mar 21 04:56:56 crc kubenswrapper[4580]: I0321 04:56:56.690468 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:56:57 crc kubenswrapper[4580]: I0321 04:56:57.700681 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kpsb" event={"ID":"484933df-fe17-42ec-99da-d1187d674051","Type":"ContainerStarted","Data":"fc980afd46cf9d25cf756a6876af0ccecbed294fda22f489208963d9bc262001"} Mar 21 04:56:58 crc kubenswrapper[4580]: I0321 04:56:58.732527 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4kpsb" podStartSLOduration=8.220880195 podStartE2EDuration="1m6.73249481s" podCreationTimestamp="2026-03-21 04:55:52 +0000 UTC" firstStartedPulling="2026-03-21 04:55:58.172034311 +0000 UTC m=+263.254617939" lastFinishedPulling="2026-03-21 04:56:56.683648926 +0000 UTC m=+321.766232554" observedRunningTime="2026-03-21 04:56:58.728098117 +0000 UTC m=+323.810681755" watchObservedRunningTime="2026-03-21 04:56:58.73249481 +0000 UTC 
m=+323.815078438" Mar 21 04:57:00 crc kubenswrapper[4580]: I0321 04:57:00.288912 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-4ffj8" Mar 21 04:57:01 crc kubenswrapper[4580]: I0321 04:57:01.836845 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" podUID="329e99ed-cd13-46e2-af1f-3e2bc9eb692d" containerName="oauth-openshift" containerID="cri-o://48bb1e8f8ad1bd53e4d3b9e166e5caf3d041f9043411e1bb80f8e1802fd5c3ab" gracePeriod=15 Mar 21 04:57:02 crc kubenswrapper[4580]: I0321 04:57:02.578695 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h"] Mar 21 04:57:02 crc kubenswrapper[4580]: I0321 04:57:02.579331 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" podUID="6fe6e371-faf7-4749-bcab-f7c196f5d080" containerName="controller-manager" containerID="cri-o://2fed3f0eb2a89d15bc3b276cad4e08c741a12c57e0f629a5f8e92f6b77199f01" gracePeriod=30 Mar 21 04:57:02 crc kubenswrapper[4580]: I0321 04:57:02.678614 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq"] Mar 21 04:57:02 crc kubenswrapper[4580]: I0321 04:57:02.678901 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" podUID="b9935eb9-b708-4943-a85a-50fe7cbd6777" containerName="route-controller-manager" containerID="cri-o://685894d22482b0b37f9c59de46b6829b0b7df9fcb555b29653219f4a673e05c1" gracePeriod=30 Mar 21 04:57:02 crc kubenswrapper[4580]: I0321 04:57:02.734521 4580 generic.go:334] "Generic (PLEG): container finished" podID="329e99ed-cd13-46e2-af1f-3e2bc9eb692d" 
containerID="48bb1e8f8ad1bd53e4d3b9e166e5caf3d041f9043411e1bb80f8e1802fd5c3ab" exitCode=0 Mar 21 04:57:02 crc kubenswrapper[4580]: I0321 04:57:02.734587 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" event={"ID":"329e99ed-cd13-46e2-af1f-3e2bc9eb692d","Type":"ContainerDied","Data":"48bb1e8f8ad1bd53e4d3b9e166e5caf3d041f9043411e1bb80f8e1802fd5c3ab"} Mar 21 04:57:03 crc kubenswrapper[4580]: I0321 04:57:03.723266 4580 patch_prober.go:28] interesting pod/controller-manager-77d6c4bb9c-l5w4h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Mar 21 04:57:03 crc kubenswrapper[4580]: I0321 04:57:03.723721 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" podUID="6fe6e371-faf7-4749-bcab-f7c196f5d080" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Mar 21 04:57:03 crc kubenswrapper[4580]: I0321 04:57:03.747708 4580 generic.go:334] "Generic (PLEG): container finished" podID="b9935eb9-b708-4943-a85a-50fe7cbd6777" containerID="685894d22482b0b37f9c59de46b6829b0b7df9fcb555b29653219f4a673e05c1" exitCode=0 Mar 21 04:57:03 crc kubenswrapper[4580]: I0321 04:57:03.747843 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" event={"ID":"b9935eb9-b708-4943-a85a-50fe7cbd6777","Type":"ContainerDied","Data":"685894d22482b0b37f9c59de46b6829b0b7df9fcb555b29653219f4a673e05c1"} Mar 21 04:57:03 crc kubenswrapper[4580]: I0321 04:57:03.750595 4580 generic.go:334] "Generic (PLEG): container finished" podID="6fe6e371-faf7-4749-bcab-f7c196f5d080" 
containerID="2fed3f0eb2a89d15bc3b276cad4e08c741a12c57e0f629a5f8e92f6b77199f01" exitCode=0 Mar 21 04:57:03 crc kubenswrapper[4580]: I0321 04:57:03.750646 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" event={"ID":"6fe6e371-faf7-4749-bcab-f7c196f5d080","Type":"ContainerDied","Data":"2fed3f0eb2a89d15bc3b276cad4e08c741a12c57e0f629a5f8e92f6b77199f01"} Mar 21 04:57:03 crc kubenswrapper[4580]: I0321 04:57:03.775006 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:57:03 crc kubenswrapper[4580]: I0321 04:57:03.775243 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.221919 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.271847 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.320912 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7594c85d6d-xl87x"] Mar 21 04:57:04 crc kubenswrapper[4580]: E0321 04:57:04.321983 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329e99ed-cd13-46e2-af1f-3e2bc9eb692d" containerName="oauth-openshift" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.322011 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="329e99ed-cd13-46e2-af1f-3e2bc9eb692d" containerName="oauth-openshift" Mar 21 04:57:04 crc kubenswrapper[4580]: E0321 04:57:04.322027 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317591a6-758a-45a3-b1a7-ca32fb8f6f34" containerName="pruner" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.322035 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="317591a6-758a-45a3-b1a7-ca32fb8f6f34" containerName="pruner" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.322372 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="317591a6-758a-45a3-b1a7-ca32fb8f6f34" containerName="pruner" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.322389 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="329e99ed-cd13-46e2-af1f-3e2bc9eb692d" containerName="oauth-openshift" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.323461 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.363741 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7594c85d6d-xl87x"] Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.398410 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-ocp-branding-template\") pod \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.398547 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-serving-cert\") pod \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.398604 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-audit-dir\") pod \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.398646 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-login\") pod \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.398685 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-provider-selection\") pod \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.398773 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-error\") pod \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.398826 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-audit-policies\") pod \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.398868 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-session\") pod \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.398905 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-idp-0-file-data\") pod \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.398926 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-cliconfig\") pod 
\"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.398953 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp55m\" (UniqueName: \"kubernetes.io/projected/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-kube-api-access-vp55m\") pod \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.398999 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-trusted-ca-bundle\") pod \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399028 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-service-ca\") pod \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399062 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-router-certs\") pod \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\" (UID: \"329e99ed-cd13-46e2-af1f-3e2bc9eb692d\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399241 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-service-ca\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: 
\"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399279 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-user-template-login\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399302 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399333 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399389 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 
04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399419 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-session\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399455 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-audit-dir\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399480 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399504 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399532 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-audit-policies\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399555 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-router-certs\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399581 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.399986 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jb6l\" (UniqueName: \"kubernetes.io/projected/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-kube-api-access-8jb6l\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.400070 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-user-template-error\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: 
\"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.412949 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.416256 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.416903 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.417029 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.418358 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.462535 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-kube-api-access-vp55m" (OuterVolumeSpecName: "kube-api-access-vp55m") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "kube-api-access-vp55m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.463353 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.464097 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.464254 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.467122 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.472712 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.475525 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.476606 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.479836 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "329e99ed-cd13-46e2-af1f-3e2bc9eb692d" (UID: "329e99ed-cd13-46e2-af1f-3e2bc9eb692d"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.501927 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-audit-dir\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502014 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502055 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502097 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-audit-policies\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502129 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-router-certs\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502159 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jb6l\" (UniqueName: \"kubernetes.io/projected/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-kube-api-access-8jb6l\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502184 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502213 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-user-template-error\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502241 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-service-ca\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " 
pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502277 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-user-template-login\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502302 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502338 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502383 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502411 4580 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-session\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502489 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502546 4580 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502563 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502576 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502590 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502603 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp55m\" (UniqueName: 
\"kubernetes.io/projected/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-kube-api-access-vp55m\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502617 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502633 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502647 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502661 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502675 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502687 4580 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502699 4580 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.502713 4580 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/329e99ed-cd13-46e2-af1f-3e2bc9eb692d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.506878 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-audit-dir\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.507060 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-session\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.507919 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-audit-policies\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.508472 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.509633 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-service-ca\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.512176 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.512941 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.515670 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-router-certs\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " 
pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.516121 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.516230 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.516142 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-user-template-login\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.516536 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-user-template-error\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.518582 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.527937 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jb6l\" (UniqueName: \"kubernetes.io/projected/bc6cc155-142d-4234-b5a9-f3fa7c38aa7e-kube-api-access-8jb6l\") pod \"oauth-openshift-7594c85d6d-xl87x\" (UID: \"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e\") " pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.618995 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.660248 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.706700 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2pdn\" (UniqueName: \"kubernetes.io/projected/b9935eb9-b708-4943-a85a-50fe7cbd6777-kube-api-access-j2pdn\") pod \"b9935eb9-b708-4943-a85a-50fe7cbd6777\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.706837 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9935eb9-b708-4943-a85a-50fe7cbd6777-config\") pod \"b9935eb9-b708-4943-a85a-50fe7cbd6777\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.706957 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9935eb9-b708-4943-a85a-50fe7cbd6777-serving-cert\") pod \"b9935eb9-b708-4943-a85a-50fe7cbd6777\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.706991 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9935eb9-b708-4943-a85a-50fe7cbd6777-client-ca\") pod \"b9935eb9-b708-4943-a85a-50fe7cbd6777\" (UID: \"b9935eb9-b708-4943-a85a-50fe7cbd6777\") " Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.710062 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9935eb9-b708-4943-a85a-50fe7cbd6777-config" (OuterVolumeSpecName: "config") pod "b9935eb9-b708-4943-a85a-50fe7cbd6777" (UID: "b9935eb9-b708-4943-a85a-50fe7cbd6777"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.715324 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9935eb9-b708-4943-a85a-50fe7cbd6777-client-ca" (OuterVolumeSpecName: "client-ca") pod "b9935eb9-b708-4943-a85a-50fe7cbd6777" (UID: "b9935eb9-b708-4943-a85a-50fe7cbd6777"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.742194 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9935eb9-b708-4943-a85a-50fe7cbd6777-kube-api-access-j2pdn" (OuterVolumeSpecName: "kube-api-access-j2pdn") pod "b9935eb9-b708-4943-a85a-50fe7cbd6777" (UID: "b9935eb9-b708-4943-a85a-50fe7cbd6777"). InnerVolumeSpecName "kube-api-access-j2pdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.742359 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9935eb9-b708-4943-a85a-50fe7cbd6777-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b9935eb9-b708-4943-a85a-50fe7cbd6777" (UID: "b9935eb9-b708-4943-a85a-50fe7cbd6777"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.779777 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77nmx" event={"ID":"37b3e873-7ca5-4413-9998-6aaf824d6cd7","Type":"ContainerStarted","Data":"8806b5164dc3cf5c683b86feded2e3841b19e481958f8f1de36e749a547fe9b5"} Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.803944 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" event={"ID":"329e99ed-cd13-46e2-af1f-3e2bc9eb692d","Type":"ContainerDied","Data":"d0eb199c5c0ada23797d0934b6ba56bdadeeb452c57aa0750d3344ae67df2267"} Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.804430 4580 scope.go:117] "RemoveContainer" containerID="48bb1e8f8ad1bd53e4d3b9e166e5caf3d041f9043411e1bb80f8e1802fd5c3ab" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.805062 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kmfk5" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.809012 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2pdn\" (UniqueName: \"kubernetes.io/projected/b9935eb9-b708-4943-a85a-50fe7cbd6777-kube-api-access-j2pdn\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.809049 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9935eb9-b708-4943-a85a-50fe7cbd6777-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.809063 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9935eb9-b708-4943-a85a-50fe7cbd6777-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.809079 4580 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9935eb9-b708-4943-a85a-50fe7cbd6777-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.822240 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ns8gg" event={"ID":"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c","Type":"ContainerStarted","Data":"8967516a25d13542594605c5e175e5e2ad35bd1f4463bf5bf652881a8e972782"} Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.831409 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ppjj" event={"ID":"c002830b-7ac1-4912-9b31-bad37ac63104","Type":"ContainerStarted","Data":"e9b886a4941199d457f32c72a0d33cc8b964cf3a922eb434b07fa8d2c8e5f014"} Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.838638 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.839048 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq" event={"ID":"b9935eb9-b708-4943-a85a-50fe7cbd6777","Type":"ContainerDied","Data":"3a4d8ec9c1c850347a155ab721050a816e7ed7c8d294f46a0cc24edc1ab26285"} Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.895898 4580 scope.go:117] "RemoveContainer" containerID="685894d22482b0b37f9c59de46b6829b0b7df9fcb555b29653219f4a673e05c1" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.896385 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.933462 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.982024 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq"] Mar 21 04:57:04 crc kubenswrapper[4580]: I0321 04:57:04.983481 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7c7c868c-7dmjq"] Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.011257 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kmfk5"] Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.020384 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-client-ca\") pod \"6fe6e371-faf7-4749-bcab-f7c196f5d080\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " Mar 21 04:57:05 crc 
kubenswrapper[4580]: I0321 04:57:05.020485 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-proxy-ca-bundles\") pod \"6fe6e371-faf7-4749-bcab-f7c196f5d080\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.020539 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-config\") pod \"6fe6e371-faf7-4749-bcab-f7c196f5d080\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.020594 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fe6e371-faf7-4749-bcab-f7c196f5d080-serving-cert\") pod \"6fe6e371-faf7-4749-bcab-f7c196f5d080\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.020658 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx65g\" (UniqueName: \"kubernetes.io/projected/6fe6e371-faf7-4749-bcab-f7c196f5d080-kube-api-access-dx65g\") pod \"6fe6e371-faf7-4749-bcab-f7c196f5d080\" (UID: \"6fe6e371-faf7-4749-bcab-f7c196f5d080\") " Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.022731 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-client-ca" (OuterVolumeSpecName: "client-ca") pod "6fe6e371-faf7-4749-bcab-f7c196f5d080" (UID: "6fe6e371-faf7-4749-bcab-f7c196f5d080"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.022892 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kmfk5"] Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.023216 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6fe6e371-faf7-4749-bcab-f7c196f5d080" (UID: "6fe6e371-faf7-4749-bcab-f7c196f5d080"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.023279 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-config" (OuterVolumeSpecName: "config") pod "6fe6e371-faf7-4749-bcab-f7c196f5d080" (UID: "6fe6e371-faf7-4749-bcab-f7c196f5d080"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.026959 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe6e371-faf7-4749-bcab-f7c196f5d080-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6fe6e371-faf7-4749-bcab-f7c196f5d080" (UID: "6fe6e371-faf7-4749-bcab-f7c196f5d080"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.030033 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe6e371-faf7-4749-bcab-f7c196f5d080-kube-api-access-dx65g" (OuterVolumeSpecName: "kube-api-access-dx65g") pod "6fe6e371-faf7-4749-bcab-f7c196f5d080" (UID: "6fe6e371-faf7-4749-bcab-f7c196f5d080"). InnerVolumeSpecName "kube-api-access-dx65g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.123157 4580 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.123216 4580 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.123234 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe6e371-faf7-4749-bcab-f7c196f5d080-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.123245 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fe6e371-faf7-4749-bcab-f7c196f5d080-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.123259 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx65g\" (UniqueName: \"kubernetes.io/projected/6fe6e371-faf7-4749-bcab-f7c196f5d080-kube-api-access-dx65g\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.181877 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7594c85d6d-xl87x"] Mar 21 04:57:05 crc kubenswrapper[4580]: W0321 04:57:05.182329 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc6cc155_142d_4234_b5a9_f3fa7c38aa7e.slice/crio-e0773176afa01cf656e72aaff48e8361f6decaf3ad7b1d9a341875682fec6044 WatchSource:0}: Error finding container e0773176afa01cf656e72aaff48e8361f6decaf3ad7b1d9a341875682fec6044: Status 404 returned 
error can't find the container with id e0773176afa01cf656e72aaff48e8361f6decaf3ad7b1d9a341875682fec6044 Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.626235 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329e99ed-cd13-46e2-af1f-3e2bc9eb692d" path="/var/lib/kubelet/pods/329e99ed-cd13-46e2-af1f-3e2bc9eb692d/volumes" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.627763 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9935eb9-b708-4943-a85a-50fe7cbd6777" path="/var/lib/kubelet/pods/b9935eb9-b708-4943-a85a-50fe7cbd6777/volumes" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.853017 4580 generic.go:334] "Generic (PLEG): container finished" podID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" containerID="8967516a25d13542594605c5e175e5e2ad35bd1f4463bf5bf652881a8e972782" exitCode=0 Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.853106 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ns8gg" event={"ID":"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c","Type":"ContainerDied","Data":"8967516a25d13542594605c5e175e5e2ad35bd1f4463bf5bf652881a8e972782"} Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.856212 4580 generic.go:334] "Generic (PLEG): container finished" podID="c002830b-7ac1-4912-9b31-bad37ac63104" containerID="e9b886a4941199d457f32c72a0d33cc8b964cf3a922eb434b07fa8d2c8e5f014" exitCode=0 Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.856279 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ppjj" event={"ID":"c002830b-7ac1-4912-9b31-bad37ac63104","Type":"ContainerDied","Data":"e9b886a4941199d457f32c72a0d33cc8b964cf3a922eb434b07fa8d2c8e5f014"} Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.860014 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.860073 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h" event={"ID":"6fe6e371-faf7-4749-bcab-f7c196f5d080","Type":"ContainerDied","Data":"eca1018e908d95a8aa144b56c3c3d170064ca50e3809620f1546cdb3ef67ef1d"} Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.860126 4580 scope.go:117] "RemoveContainer" containerID="2fed3f0eb2a89d15bc3b276cad4e08c741a12c57e0f629a5f8e92f6b77199f01" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.886162 4580 generic.go:334] "Generic (PLEG): container finished" podID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" containerID="8806b5164dc3cf5c683b86feded2e3841b19e481958f8f1de36e749a547fe9b5" exitCode=0 Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.886230 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77nmx" event={"ID":"37b3e873-7ca5-4413-9998-6aaf824d6cd7","Type":"ContainerDied","Data":"8806b5164dc3cf5c683b86feded2e3841b19e481958f8f1de36e749a547fe9b5"} Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.895320 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h"] Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.907714 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77d6c4bb9c-l5w4h"] Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.908015 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" event={"ID":"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e","Type":"ContainerStarted","Data":"03a56d9e02ad6a771618f79652f969968709a14c0e0eebd8ebcb840abe3eb1c7"} Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.908083 4580 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" event={"ID":"bc6cc155-142d-4234-b5a9-f3fa7c38aa7e","Type":"ContainerStarted","Data":"e0773176afa01cf656e72aaff48e8361f6decaf3ad7b1d9a341875682fec6044"} Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.908865 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:05 crc kubenswrapper[4580]: I0321 04:57:05.973951 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" podStartSLOduration=29.973927256 podStartE2EDuration="29.973927256s" podCreationTimestamp="2026-03-21 04:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:57:05.971604991 +0000 UTC m=+331.054188639" watchObservedRunningTime="2026-03-21 04:57:05.973927256 +0000 UTC m=+331.056510884" Mar 21 04:57:06 crc kubenswrapper[4580]: I0321 04:57:06.310071 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7594c85d6d-xl87x" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.092599 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-764dd66f8c-7q9rm"] Mar 21 04:57:07 crc kubenswrapper[4580]: E0321 04:57:07.093842 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9935eb9-b708-4943-a85a-50fe7cbd6777" containerName="route-controller-manager" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.093867 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9935eb9-b708-4943-a85a-50fe7cbd6777" containerName="route-controller-manager" Mar 21 04:57:07 crc kubenswrapper[4580]: E0321 04:57:07.093889 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe6e371-faf7-4749-bcab-f7c196f5d080" 
containerName="controller-manager" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.093896 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe6e371-faf7-4749-bcab-f7c196f5d080" containerName="controller-manager" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.097422 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe6e371-faf7-4749-bcab-f7c196f5d080" containerName="controller-manager" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.097518 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9935eb9-b708-4943-a85a-50fe7cbd6777" containerName="route-controller-manager" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.098659 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.106633 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.107561 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.107844 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.108018 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.108736 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.109349 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:57:07 crc 
kubenswrapper[4580]: I0321 04:57:07.117439 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569b474869-hklkv"] Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.118126 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.118306 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.126260 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.126346 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.126369 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.126626 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.128467 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.129001 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.129941 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569b474869-hklkv"] Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.134443 4580 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-764dd66f8c-7q9rm"] Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.269869 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d98e5e9-408e-4d05-807f-8a3993f0da4a-client-ca\") pod \"route-controller-manager-569b474869-hklkv\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") " pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.269944 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d98e5e9-408e-4d05-807f-8a3993f0da4a-config\") pod \"route-controller-manager-569b474869-hklkv\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") " pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.269962 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wvbx\" (UniqueName: \"kubernetes.io/projected/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-kube-api-access-2wvbx\") pod \"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.269988 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d98e5e9-408e-4d05-807f-8a3993f0da4a-serving-cert\") pod \"route-controller-manager-569b474869-hklkv\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") " pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.270080 4580 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-config\") pod \"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.270128 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-client-ca\") pod \"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.270145 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpmj5\" (UniqueName: \"kubernetes.io/projected/0d98e5e9-408e-4d05-807f-8a3993f0da4a-kube-api-access-zpmj5\") pod \"route-controller-manager-569b474869-hklkv\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") " pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.270163 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-serving-cert\") pod \"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.270184 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-proxy-ca-bundles\") pod 
\"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.372557 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-config\") pod \"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.372645 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-client-ca\") pod \"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.372678 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpmj5\" (UniqueName: \"kubernetes.io/projected/0d98e5e9-408e-4d05-807f-8a3993f0da4a-kube-api-access-zpmj5\") pod \"route-controller-manager-569b474869-hklkv\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") " pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.372712 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-serving-cert\") pod \"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.372743 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-proxy-ca-bundles\") pod \"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.372811 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d98e5e9-408e-4d05-807f-8a3993f0da4a-client-ca\") pod \"route-controller-manager-569b474869-hklkv\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") " pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.372844 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d98e5e9-408e-4d05-807f-8a3993f0da4a-config\") pod \"route-controller-manager-569b474869-hklkv\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") " pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.372869 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wvbx\" (UniqueName: \"kubernetes.io/projected/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-kube-api-access-2wvbx\") pod \"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.372898 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d98e5e9-408e-4d05-807f-8a3993f0da4a-serving-cert\") pod \"route-controller-manager-569b474869-hklkv\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") " 
pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.374366 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-config\") pod \"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.375134 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-proxy-ca-bundles\") pod \"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.375454 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d98e5e9-408e-4d05-807f-8a3993f0da4a-client-ca\") pod \"route-controller-manager-569b474869-hklkv\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") " pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.375712 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d98e5e9-408e-4d05-807f-8a3993f0da4a-config\") pod \"route-controller-manager-569b474869-hklkv\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") " pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.377334 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-client-ca\") pod 
\"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.381473 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-serving-cert\") pod \"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.389880 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d98e5e9-408e-4d05-807f-8a3993f0da4a-serving-cert\") pod \"route-controller-manager-569b474869-hklkv\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") " pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.403416 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpmj5\" (UniqueName: \"kubernetes.io/projected/0d98e5e9-408e-4d05-807f-8a3993f0da4a-kube-api-access-zpmj5\") pod \"route-controller-manager-569b474869-hklkv\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") " pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.403607 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wvbx\" (UniqueName: \"kubernetes.io/projected/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-kube-api-access-2wvbx\") pod \"controller-manager-764dd66f8c-7q9rm\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") " pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.419775 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.453633 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.602212 4580 ???:1] "http: TLS handshake error from 192.168.126.11:49320: no serving certificate available for the kubelet" Mar 21 04:57:07 crc kubenswrapper[4580]: I0321 04:57:07.628707 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe6e371-faf7-4749-bcab-f7c196f5d080" path="/var/lib/kubelet/pods/6fe6e371-faf7-4749-bcab-f7c196f5d080/volumes" Mar 21 04:57:08 crc kubenswrapper[4580]: I0321 04:57:08.562108 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-764dd66f8c-7q9rm"] Mar 21 04:57:08 crc kubenswrapper[4580]: I0321 04:57:08.832557 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569b474869-hklkv"] Mar 21 04:57:08 crc kubenswrapper[4580]: I0321 04:57:08.934152 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" event={"ID":"ef8cf0b4-6e6e-4c54-a9af-95b814a84112","Type":"ContainerStarted","Data":"c4dd17aaeb8712c0b17c20bddd30e227356bf39e518bf6d5bb818bb2b3527f83"} Mar 21 04:57:08 crc kubenswrapper[4580]: I0321 04:57:08.935471 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" event={"ID":"0d98e5e9-408e-4d05-807f-8a3993f0da4a","Type":"ContainerStarted","Data":"1af48b33f85b2cc0811694eb381993677e836600c68ba91518d24c7d484f847e"} Mar 21 04:57:09 crc kubenswrapper[4580]: I0321 04:57:09.945967 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-ns8gg" event={"ID":"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c","Type":"ContainerStarted","Data":"2906bf6ad16aee4244bf07292c53b40e9340abb3618410bf10d6664a6ee01b68"} Mar 21 04:57:10 crc kubenswrapper[4580]: I0321 04:57:10.980414 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ns8gg" podStartSLOduration=7.660484097 podStartE2EDuration="1m17.980382967s" podCreationTimestamp="2026-03-21 04:55:53 +0000 UTC" firstStartedPulling="2026-03-21 04:55:58.076182625 +0000 UTC m=+263.158766263" lastFinishedPulling="2026-03-21 04:57:08.396081505 +0000 UTC m=+333.478665133" observedRunningTime="2026-03-21 04:57:10.977451675 +0000 UTC m=+336.060035343" watchObservedRunningTime="2026-03-21 04:57:10.980382967 +0000 UTC m=+336.062966595" Mar 21 04:57:12 crc kubenswrapper[4580]: I0321 04:57:12.971107 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq5t4" event={"ID":"1dd3cd12-741f-4993-8b39-994545e15c2c","Type":"ContainerStarted","Data":"8b0aa4352205385a7c96780532b569d97e9b34b41f2b2a81e9907a25bd3dda4a"} Mar 21 04:57:12 crc kubenswrapper[4580]: I0321 04:57:12.973953 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n99sq" event={"ID":"82874992-faa8-4c73-955b-ffe5f02726a7","Type":"ContainerStarted","Data":"4728e5e044a69f7890b69dccb2c0cba61649a5a887999c873e5ca47b245278b4"} Mar 21 04:57:12 crc kubenswrapper[4580]: I0321 04:57:12.975989 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77nmx" event={"ID":"37b3e873-7ca5-4413-9998-6aaf824d6cd7","Type":"ContainerStarted","Data":"0e74c3fade6a8363e1e513360c03f6b271c3fe659dcf4cd9684fc5169792e54d"} Mar 21 04:57:12 crc kubenswrapper[4580]: I0321 04:57:12.979136 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" event={"ID":"0d98e5e9-408e-4d05-807f-8a3993f0da4a","Type":"ContainerStarted","Data":"439ad8c5484cac5a527724ec0a479400abdac6c8b25cd30b1f7946f60267f1c6"} Mar 21 04:57:12 crc kubenswrapper[4580]: I0321 04:57:12.979816 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:12 crc kubenswrapper[4580]: I0321 04:57:12.984963 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ppjj" event={"ID":"c002830b-7ac1-4912-9b31-bad37ac63104","Type":"ContainerStarted","Data":"3e3b9d28a7226fe74b412c9b02ab9c697c7f56a2a79a0d821d7daeefce05559e"} Mar 21 04:57:12 crc kubenswrapper[4580]: I0321 04:57:12.989631 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" event={"ID":"ef8cf0b4-6e6e-4c54-a9af-95b814a84112","Type":"ContainerStarted","Data":"cf80ef1643ade6c5b369aeffe004ea8b4999e03d0d5c42cffb63a08a8273d212"} Mar 21 04:57:12 crc kubenswrapper[4580]: I0321 04:57:12.989951 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:13 crc kubenswrapper[4580]: I0321 04:57:13.028513 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" Mar 21 04:57:13 crc kubenswrapper[4580]: I0321 04:57:13.052557 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-77nmx" podStartSLOduration=6.600816854 podStartE2EDuration="1m23.052538151s" podCreationTimestamp="2026-03-21 04:55:50 +0000 UTC" firstStartedPulling="2026-03-21 04:55:54.574239674 +0000 UTC m=+259.656823302" lastFinishedPulling="2026-03-21 04:57:11.025960971 +0000 UTC m=+336.108544599" 
observedRunningTime="2026-03-21 04:57:13.013735406 +0000 UTC m=+338.096319044" watchObservedRunningTime="2026-03-21 04:57:13.052538151 +0000 UTC m=+338.135121779" Mar 21 04:57:13 crc kubenswrapper[4580]: I0321 04:57:13.083024 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" podStartSLOduration=11.082997512 podStartE2EDuration="11.082997512s" podCreationTimestamp="2026-03-21 04:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:57:13.053272331 +0000 UTC m=+338.135855979" watchObservedRunningTime="2026-03-21 04:57:13.082997512 +0000 UTC m=+338.165581140" Mar 21 04:57:13 crc kubenswrapper[4580]: I0321 04:57:13.085868 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" podStartSLOduration=11.085860853 podStartE2EDuration="11.085860853s" podCreationTimestamp="2026-03-21 04:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:57:13.079723791 +0000 UTC m=+338.162307429" watchObservedRunningTime="2026-03-21 04:57:13.085860853 +0000 UTC m=+338.168444481" Mar 21 04:57:13 crc kubenswrapper[4580]: I0321 04:57:13.092848 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:13 crc kubenswrapper[4580]: I0321 04:57:13.844724 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:57:13 crc kubenswrapper[4580]: I0321 04:57:13.844775 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:57:13 crc 
kubenswrapper[4580]: I0321 04:57:13.999335 4580 generic.go:334] "Generic (PLEG): container finished" podID="82874992-faa8-4c73-955b-ffe5f02726a7" containerID="4728e5e044a69f7890b69dccb2c0cba61649a5a887999c873e5ca47b245278b4" exitCode=0 Mar 21 04:57:14 crc kubenswrapper[4580]: I0321 04:57:13.999427 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n99sq" event={"ID":"82874992-faa8-4c73-955b-ffe5f02726a7","Type":"ContainerDied","Data":"4728e5e044a69f7890b69dccb2c0cba61649a5a887999c873e5ca47b245278b4"} Mar 21 04:57:14 crc kubenswrapper[4580]: I0321 04:57:14.002982 4580 generic.go:334] "Generic (PLEG): container finished" podID="1dd3cd12-741f-4993-8b39-994545e15c2c" containerID="8b0aa4352205385a7c96780532b569d97e9b34b41f2b2a81e9907a25bd3dda4a" exitCode=0 Mar 21 04:57:14 crc kubenswrapper[4580]: I0321 04:57:14.003078 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq5t4" event={"ID":"1dd3cd12-741f-4993-8b39-994545e15c2c","Type":"ContainerDied","Data":"8b0aa4352205385a7c96780532b569d97e9b34b41f2b2a81e9907a25bd3dda4a"} Mar 21 04:57:14 crc kubenswrapper[4580]: I0321 04:57:14.039403 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6ppjj" podStartSLOduration=9.955828749 podStartE2EDuration="1m24.0393838s" podCreationTimestamp="2026-03-21 04:55:50 +0000 UTC" firstStartedPulling="2026-03-21 04:55:54.618375325 +0000 UTC m=+259.700958943" lastFinishedPulling="2026-03-21 04:57:08.701930366 +0000 UTC m=+333.784513994" observedRunningTime="2026-03-21 04:57:14.038084494 +0000 UTC m=+339.120668132" watchObservedRunningTime="2026-03-21 04:57:14.0393838 +0000 UTC m=+339.121967438" Mar 21 04:57:14 crc kubenswrapper[4580]: I0321 04:57:14.886330 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ns8gg" podUID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" 
containerName="registry-server" probeResult="failure" output=< Mar 21 04:57:14 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 04:57:14 crc kubenswrapper[4580]: > Mar 21 04:57:15 crc kubenswrapper[4580]: I0321 04:57:15.949636 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:57:15 crc kubenswrapper[4580]: I0321 04:57:15.950341 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:57:15 crc kubenswrapper[4580]: I0321 04:57:15.950411 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 04:57:15 crc kubenswrapper[4580]: I0321 04:57:15.951176 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:57:15 crc kubenswrapper[4580]: I0321 04:57:15.951227 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed" gracePeriod=600 Mar 21 04:57:16 
crc kubenswrapper[4580]: I0321 04:57:16.022728 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567814-8cxbg" event={"ID":"1714688f-61d5-436b-baaf-2668757942fd","Type":"ContainerStarted","Data":"ec455bb30fbb2c325792f041e355d5d7a4ca5f20706914886a644ffc606763cf"} Mar 21 04:57:16 crc kubenswrapper[4580]: I0321 04:57:16.025672 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kk5nv" event={"ID":"9940b0fa-e788-4da2-af4f-da4cdc60f12d","Type":"ContainerStarted","Data":"b7b7210d8eda8a5bc1a45ecc3459e7cbbabe68d78761b7f78455d143a8965598"} Mar 21 04:57:16 crc kubenswrapper[4580]: I0321 04:57:16.029994 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq5t4" event={"ID":"1dd3cd12-741f-4993-8b39-994545e15c2c","Type":"ContainerStarted","Data":"de9b5104d4f276a0bd591e7730a9c3faede1841f2ffa9e8303eed7e6a3b21f59"} Mar 21 04:57:16 crc kubenswrapper[4580]: I0321 04:57:16.032593 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n99sq" event={"ID":"82874992-faa8-4c73-955b-ffe5f02726a7","Type":"ContainerStarted","Data":"92c7faaec51f14fd1e7fdcaf810c546556533e202c39331b82a5f2bfa03c49b4"} Mar 21 04:57:16 crc kubenswrapper[4580]: I0321 04:57:16.034306 4580 generic.go:334] "Generic (PLEG): container finished" podID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" containerID="eff74d951ae334154034f0b3dc78f9dfb1f403fefc9feef7a4114c863b320ae1" exitCode=0 Mar 21 04:57:16 crc kubenswrapper[4580]: I0321 04:57:16.034368 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47qgx" event={"ID":"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d","Type":"ContainerDied","Data":"eff74d951ae334154034f0b3dc78f9dfb1f403fefc9feef7a4114c863b320ae1"} Mar 21 04:57:16 crc kubenswrapper[4580]: I0321 04:57:16.039727 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567816-m2qj9" event={"ID":"0cb67fb0-cbe3-47cd-9029-f54e6e74729d","Type":"ContainerStarted","Data":"891881ebecb575e921905eef7c5cc09306edba571f5bd5dd4b455d143e81690a"}
Mar 21 04:57:16 crc kubenswrapper[4580]: I0321 04:57:16.109086 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567814-8cxbg" podStartSLOduration=106.171780028 podStartE2EDuration="3m16.109056345s" podCreationTimestamp="2026-03-21 04:54:00 +0000 UTC" firstStartedPulling="2026-03-21 04:55:45.363267693 +0000 UTC m=+250.445851321" lastFinishedPulling="2026-03-21 04:57:15.30054401 +0000 UTC m=+340.383127638" observedRunningTime="2026-03-21 04:57:16.082729669 +0000 UTC m=+341.165313327" watchObservedRunningTime="2026-03-21 04:57:16.109056345 +0000 UTC m=+341.191639973"
Mar 21 04:57:16 crc kubenswrapper[4580]: I0321 04:57:16.140480 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567816-m2qj9" podStartSLOduration=2.016102134 podStartE2EDuration="1m16.140459742s" podCreationTimestamp="2026-03-21 04:56:00 +0000 UTC" firstStartedPulling="2026-03-21 04:56:01.234555644 +0000 UTC m=+266.317139262" lastFinishedPulling="2026-03-21 04:57:15.358913242 +0000 UTC m=+340.441496870" observedRunningTime="2026-03-21 04:57:16.140111523 +0000 UTC m=+341.222695151" watchObservedRunningTime="2026-03-21 04:57:16.140459742 +0000 UTC m=+341.223043370"
Mar 21 04:57:16 crc kubenswrapper[4580]: I0321 04:57:16.171572 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n99sq" podStartSLOduration=4.801304334 podStartE2EDuration="1m27.171542022s" podCreationTimestamp="2026-03-21 04:55:49 +0000 UTC" firstStartedPulling="2026-03-21 04:55:53.276851481 +0000 UTC m=+258.359435109" lastFinishedPulling="2026-03-21 04:57:15.647089169 +0000 UTC m=+340.729672797" observedRunningTime="2026-03-21 04:57:16.168720733 +0000 UTC m=+341.251304381" watchObservedRunningTime="2026-03-21 04:57:16.171542022 +0000 UTC m=+341.254125650"
Mar 21 04:57:16 crc kubenswrapper[4580]: I0321 04:57:16.235629 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jq5t4" podStartSLOduration=3.836049523 podStartE2EDuration="1m26.235606253s" podCreationTimestamp="2026-03-21 04:55:50 +0000 UTC" firstStartedPulling="2026-03-21 04:55:53.339058818 +0000 UTC m=+258.421642446" lastFinishedPulling="2026-03-21 04:57:15.738615548 +0000 UTC m=+340.821199176" observedRunningTime="2026-03-21 04:57:16.200870961 +0000 UTC m=+341.283454589" watchObservedRunningTime="2026-03-21 04:57:16.235606253 +0000 UTC m=+341.318189881"
Mar 21 04:57:17 crc kubenswrapper[4580]: I0321 04:57:17.013716 4580 csr.go:261] certificate signing request csr-c7xwr is approved, waiting to be issued
Mar 21 04:57:17 crc kubenswrapper[4580]: I0321 04:57:17.025228 4580 csr.go:257] certificate signing request csr-c7xwr is issued
Mar 21 04:57:17 crc kubenswrapper[4580]: I0321 04:57:17.049270 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed" exitCode=0
Mar 21 04:57:17 crc kubenswrapper[4580]: I0321 04:57:17.049342 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed"}
Mar 21 04:57:17 crc kubenswrapper[4580]: I0321 04:57:17.049380 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"4c2d2ca0ada0ea0c349eeeb34497c0724179464955a36c9a80f234b23a820b94"}
Mar 21 04:57:17 crc kubenswrapper[4580]: I0321 04:57:17.055319 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47qgx" event={"ID":"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d","Type":"ContainerStarted","Data":"fdcb21da2ace935c0abda79e4e71e649535f556182a6b1ea39e256d0ea16c547"}
Mar 21 04:57:17 crc kubenswrapper[4580]: I0321 04:57:17.057590 4580 generic.go:334] "Generic (PLEG): container finished" podID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" containerID="b7b7210d8eda8a5bc1a45ecc3459e7cbbabe68d78761b7f78455d143a8965598" exitCode=0
Mar 21 04:57:17 crc kubenswrapper[4580]: I0321 04:57:17.057827 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kk5nv" event={"ID":"9940b0fa-e788-4da2-af4f-da4cdc60f12d","Type":"ContainerDied","Data":"b7b7210d8eda8a5bc1a45ecc3459e7cbbabe68d78761b7f78455d143a8965598"}
Mar 21 04:57:17 crc kubenswrapper[4580]: I0321 04:57:17.098367 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-47qgx" podStartSLOduration=6.607048701 podStartE2EDuration="1m25.098335722s" podCreationTimestamp="2026-03-21 04:55:52 +0000 UTC" firstStartedPulling="2026-03-21 04:55:58.056071138 +0000 UTC m=+263.138654766" lastFinishedPulling="2026-03-21 04:57:16.547358159 +0000 UTC m=+341.629941787" observedRunningTime="2026-03-21 04:57:17.096974184 +0000 UTC m=+342.179557822" watchObservedRunningTime="2026-03-21 04:57:17.098335722 +0000 UTC m=+342.180919350"
Mar 21 04:57:18 crc kubenswrapper[4580]: I0321 04:57:18.029455 4580 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-12 02:46:37.429002234 +0000 UTC
Mar 21 04:57:18 crc kubenswrapper[4580]: I0321 04:57:18.030021 4580 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7125h49m19.398986853s for next certificate rotation
Mar 21 04:57:18 crc kubenswrapper[4580]: I0321 04:57:18.086618 4580 generic.go:334] "Generic (PLEG): container finished" podID="0cb67fb0-cbe3-47cd-9029-f54e6e74729d" containerID="891881ebecb575e921905eef7c5cc09306edba571f5bd5dd4b455d143e81690a" exitCode=0
Mar 21 04:57:18 crc kubenswrapper[4580]: I0321 04:57:18.087017 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567816-m2qj9" event={"ID":"0cb67fb0-cbe3-47cd-9029-f54e6e74729d","Type":"ContainerDied","Data":"891881ebecb575e921905eef7c5cc09306edba571f5bd5dd4b455d143e81690a"}
Mar 21 04:57:18 crc kubenswrapper[4580]: I0321 04:57:18.090304 4580 generic.go:334] "Generic (PLEG): container finished" podID="1714688f-61d5-436b-baaf-2668757942fd" containerID="ec455bb30fbb2c325792f041e355d5d7a4ca5f20706914886a644ffc606763cf" exitCode=0
Mar 21 04:57:18 crc kubenswrapper[4580]: I0321 04:57:18.090371 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567814-8cxbg" event={"ID":"1714688f-61d5-436b-baaf-2668757942fd","Type":"ContainerDied","Data":"ec455bb30fbb2c325792f041e355d5d7a4ca5f20706914886a644ffc606763cf"}
Mar 21 04:57:19 crc kubenswrapper[4580]: I0321 04:57:19.031025 4580 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-24 14:53:35.580392005 +0000 UTC
Mar 21 04:57:19 crc kubenswrapper[4580]: I0321 04:57:19.031463 4580 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5961h56m16.548932936s for next certificate rotation
Mar 21 04:57:19 crc kubenswrapper[4580]: I0321 04:57:19.100241 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kk5nv" event={"ID":"9940b0fa-e788-4da2-af4f-da4cdc60f12d","Type":"ContainerStarted","Data":"8b056b7426c8b8da087bba83b7bdcf3efa8446cc8e7a0d2e9bb1c7e7ab8b7ef6"}
Mar 21 04:57:19 crc kubenswrapper[4580]: I0321 04:57:19.126260 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kk5nv" podStartSLOduration=6.304991755 podStartE2EDuration="1m26.126234919s" podCreationTimestamp="2026-03-21 04:55:53 +0000 UTC" firstStartedPulling="2026-03-21 04:55:58.090607283 +0000 UTC m=+263.173190911" lastFinishedPulling="2026-03-21 04:57:17.911850447 +0000 UTC m=+342.994434075" observedRunningTime="2026-03-21 04:57:19.12482996 +0000 UTC m=+344.207413588" watchObservedRunningTime="2026-03-21 04:57:19.126234919 +0000 UTC m=+344.208818557"
Mar 21 04:57:19 crc kubenswrapper[4580]: I0321 04:57:19.505649 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-m2qj9"
Mar 21 04:57:19 crc kubenswrapper[4580]: I0321 04:57:19.569156 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpbx7\" (UniqueName: \"kubernetes.io/projected/0cb67fb0-cbe3-47cd-9029-f54e6e74729d-kube-api-access-jpbx7\") pod \"0cb67fb0-cbe3-47cd-9029-f54e6e74729d\" (UID: \"0cb67fb0-cbe3-47cd-9029-f54e6e74729d\") "
Mar 21 04:57:19 crc kubenswrapper[4580]: I0321 04:57:19.592114 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb67fb0-cbe3-47cd-9029-f54e6e74729d-kube-api-access-jpbx7" (OuterVolumeSpecName: "kube-api-access-jpbx7") pod "0cb67fb0-cbe3-47cd-9029-f54e6e74729d" (UID: "0cb67fb0-cbe3-47cd-9029-f54e6e74729d"). InnerVolumeSpecName "kube-api-access-jpbx7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:57:19 crc kubenswrapper[4580]: I0321 04:57:19.670407 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpbx7\" (UniqueName: \"kubernetes.io/projected/0cb67fb0-cbe3-47cd-9029-f54e6e74729d-kube-api-access-jpbx7\") on node \"crc\" DevicePath \"\""
Mar 21 04:57:19 crc kubenswrapper[4580]: I0321 04:57:19.740754 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-8cxbg"
Mar 21 04:57:19 crc kubenswrapper[4580]: I0321 04:57:19.873336 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7njkm\" (UniqueName: \"kubernetes.io/projected/1714688f-61d5-436b-baaf-2668757942fd-kube-api-access-7njkm\") pod \"1714688f-61d5-436b-baaf-2668757942fd\" (UID: \"1714688f-61d5-436b-baaf-2668757942fd\") "
Mar 21 04:57:19 crc kubenswrapper[4580]: I0321 04:57:19.877492 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1714688f-61d5-436b-baaf-2668757942fd-kube-api-access-7njkm" (OuterVolumeSpecName: "kube-api-access-7njkm") pod "1714688f-61d5-436b-baaf-2668757942fd" (UID: "1714688f-61d5-436b-baaf-2668757942fd"). InnerVolumeSpecName "kube-api-access-7njkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:57:19 crc kubenswrapper[4580]: I0321 04:57:19.975178 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7njkm\" (UniqueName: \"kubernetes.io/projected/1714688f-61d5-436b-baaf-2668757942fd-kube-api-access-7njkm\") on node \"crc\" DevicePath \"\""
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.043441 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n99sq"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.043523 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n99sq"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.100160 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n99sq"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.109929 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567814-8cxbg" event={"ID":"1714688f-61d5-436b-baaf-2668757942fd","Type":"ContainerDied","Data":"2e83675d6593419ae3dcd0918b0bacf5852f8886683253cf706a71bf0f7992c8"}
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.109967 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e83675d6593419ae3dcd0918b0bacf5852f8886683253cf706a71bf0f7992c8"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.109977 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-8cxbg"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.111934 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567816-m2qj9" event={"ID":"0cb67fb0-cbe3-47cd-9029-f54e6e74729d","Type":"ContainerDied","Data":"715555ab4a05645e44b374807e00f38c00af61d4dbc00b23ee7eda184d2f474c"}
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.111956 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="715555ab4a05645e44b374807e00f38c00af61d4dbc00b23ee7eda184d2f474c"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.111985 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-m2qj9"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.160914 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n99sq"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.447155 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jq5t4"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.447616 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jq5t4"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.491266 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jq5t4"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.659236 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-77nmx"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.659297 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-77nmx"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.698596 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-77nmx"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.929803 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6ppjj"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.929844 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6ppjj"
Mar 21 04:57:20 crc kubenswrapper[4580]: I0321 04:57:20.971982 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6ppjj"
Mar 21 04:57:21 crc kubenswrapper[4580]: I0321 04:57:21.168170 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-77nmx"
Mar 21 04:57:21 crc kubenswrapper[4580]: I0321 04:57:21.168556 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6ppjj"
Mar 21 04:57:21 crc kubenswrapper[4580]: I0321 04:57:21.189374 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jq5t4"
Mar 21 04:57:22 crc kubenswrapper[4580]: I0321 04:57:22.576609 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-764dd66f8c-7q9rm"]
Mar 21 04:57:22 crc kubenswrapper[4580]: I0321 04:57:22.577432 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" podUID="ef8cf0b4-6e6e-4c54-a9af-95b814a84112" containerName="controller-manager" containerID="cri-o://cf80ef1643ade6c5b369aeffe004ea8b4999e03d0d5c42cffb63a08a8273d212" gracePeriod=30
Mar 21 04:57:22 crc kubenswrapper[4580]: I0321 04:57:22.620544 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569b474869-hklkv"]
Mar 21 04:57:22 crc kubenswrapper[4580]: I0321 04:57:22.620953 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" podUID="0d98e5e9-408e-4d05-807f-8a3993f0da4a" containerName="route-controller-manager" containerID="cri-o://439ad8c5484cac5a527724ec0a479400abdac6c8b25cd30b1f7946f60267f1c6" gracePeriod=30
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.134742 4580 generic.go:334] "Generic (PLEG): container finished" podID="0d98e5e9-408e-4d05-807f-8a3993f0da4a" containerID="439ad8c5484cac5a527724ec0a479400abdac6c8b25cd30b1f7946f60267f1c6" exitCode=0
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.134824 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" event={"ID":"0d98e5e9-408e-4d05-807f-8a3993f0da4a","Type":"ContainerDied","Data":"439ad8c5484cac5a527724ec0a479400abdac6c8b25cd30b1f7946f60267f1c6"}
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.137069 4580 generic.go:334] "Generic (PLEG): container finished" podID="ef8cf0b4-6e6e-4c54-a9af-95b814a84112" containerID="cf80ef1643ade6c5b369aeffe004ea8b4999e03d0d5c42cffb63a08a8273d212" exitCode=0
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.137119 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" event={"ID":"ef8cf0b4-6e6e-4c54-a9af-95b814a84112","Type":"ContainerDied","Data":"cf80ef1643ade6c5b369aeffe004ea8b4999e03d0d5c42cffb63a08a8273d212"}
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.193766 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm"
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.237151 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv"
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.320455 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpmj5\" (UniqueName: \"kubernetes.io/projected/0d98e5e9-408e-4d05-807f-8a3993f0da4a-kube-api-access-zpmj5\") pod \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") "
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.320520 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-serving-cert\") pod \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") "
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.320559 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d98e5e9-408e-4d05-807f-8a3993f0da4a-config\") pod \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") "
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.320591 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-config\") pod \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") "
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.320640 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-proxy-ca-bundles\") pod \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") "
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.320676 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wvbx\" (UniqueName: \"kubernetes.io/projected/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-kube-api-access-2wvbx\") pod \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") "
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.320752 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-client-ca\") pod \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\" (UID: \"ef8cf0b4-6e6e-4c54-a9af-95b814a84112\") "
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.320772 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d98e5e9-408e-4d05-807f-8a3993f0da4a-serving-cert\") pod \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") "
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.320833 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d98e5e9-408e-4d05-807f-8a3993f0da4a-client-ca\") pod \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\" (UID: \"0d98e5e9-408e-4d05-807f-8a3993f0da4a\") "
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.322129 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d98e5e9-408e-4d05-807f-8a3993f0da4a-client-ca" (OuterVolumeSpecName: "client-ca") pod "0d98e5e9-408e-4d05-807f-8a3993f0da4a" (UID: "0d98e5e9-408e-4d05-807f-8a3993f0da4a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.324708 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ef8cf0b4-6e6e-4c54-a9af-95b814a84112" (UID: "ef8cf0b4-6e6e-4c54-a9af-95b814a84112"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.324740 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d98e5e9-408e-4d05-807f-8a3993f0da4a-config" (OuterVolumeSpecName: "config") pod "0d98e5e9-408e-4d05-807f-8a3993f0da4a" (UID: "0d98e5e9-408e-4d05-807f-8a3993f0da4a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.325350 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-config" (OuterVolumeSpecName: "config") pod "ef8cf0b4-6e6e-4c54-a9af-95b814a84112" (UID: "ef8cf0b4-6e6e-4c54-a9af-95b814a84112"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.325634 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-client-ca" (OuterVolumeSpecName: "client-ca") pod "ef8cf0b4-6e6e-4c54-a9af-95b814a84112" (UID: "ef8cf0b4-6e6e-4c54-a9af-95b814a84112"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.327318 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d98e5e9-408e-4d05-807f-8a3993f0da4a-kube-api-access-zpmj5" (OuterVolumeSpecName: "kube-api-access-zpmj5") pod "0d98e5e9-408e-4d05-807f-8a3993f0da4a" (UID: "0d98e5e9-408e-4d05-807f-8a3993f0da4a"). InnerVolumeSpecName "kube-api-access-zpmj5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.327751 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ef8cf0b4-6e6e-4c54-a9af-95b814a84112" (UID: "ef8cf0b4-6e6e-4c54-a9af-95b814a84112"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.329134 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-kube-api-access-2wvbx" (OuterVolumeSpecName: "kube-api-access-2wvbx") pod "ef8cf0b4-6e6e-4c54-a9af-95b814a84112" (UID: "ef8cf0b4-6e6e-4c54-a9af-95b814a84112"). InnerVolumeSpecName "kube-api-access-2wvbx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.330549 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d98e5e9-408e-4d05-807f-8a3993f0da4a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0d98e5e9-408e-4d05-807f-8a3993f0da4a" (UID: "0d98e5e9-408e-4d05-807f-8a3993f0da4a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.423223 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wvbx\" (UniqueName: \"kubernetes.io/projected/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-kube-api-access-2wvbx\") on node \"crc\" DevicePath \"\""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.423266 4580 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.423281 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d98e5e9-408e-4d05-807f-8a3993f0da4a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.423295 4580 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d98e5e9-408e-4d05-807f-8a3993f0da4a-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.423308 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpmj5\" (UniqueName: \"kubernetes.io/projected/0d98e5e9-408e-4d05-807f-8a3993f0da4a-kube-api-access-zpmj5\") on node \"crc\" DevicePath \"\""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.423320 4580 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.423332 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d98e5e9-408e-4d05-807f-8a3993f0da4a-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.423344 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.423356 4580 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef8cf0b4-6e6e-4c54-a9af-95b814a84112-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.456127 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6ppjj"]
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.456442 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6ppjj" podUID="c002830b-7ac1-4912-9b31-bad37ac63104" containerName="registry-server" containerID="cri-o://3e3b9d28a7226fe74b412c9b02ab9c697c7f56a2a79a0d821d7daeefce05559e" gracePeriod=2
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.784829 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-47qgx"
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.784888 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-47qgx"
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.828973 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-47qgx"
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.888264 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ns8gg"
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.927515 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ns8gg"
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.988657 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kk5nv"
Mar 21 04:57:23 crc kubenswrapper[4580]: I0321 04:57:23.988728 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kk5nv"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.092594 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bc795cf67-fn9lm"]
Mar 21 04:57:24 crc kubenswrapper[4580]: E0321 04:57:24.092918 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb67fb0-cbe3-47cd-9029-f54e6e74729d" containerName="oc"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.092934 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb67fb0-cbe3-47cd-9029-f54e6e74729d" containerName="oc"
Mar 21 04:57:24 crc kubenswrapper[4580]: E0321 04:57:24.092961 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d98e5e9-408e-4d05-807f-8a3993f0da4a" containerName="route-controller-manager"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.092967 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d98e5e9-408e-4d05-807f-8a3993f0da4a" containerName="route-controller-manager"
Mar 21 04:57:24 crc kubenswrapper[4580]: E0321 04:57:24.092979 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1714688f-61d5-436b-baaf-2668757942fd" containerName="oc"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.092985 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1714688f-61d5-436b-baaf-2668757942fd" containerName="oc"
Mar 21 04:57:24 crc kubenswrapper[4580]: E0321 04:57:24.092993 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef8cf0b4-6e6e-4c54-a9af-95b814a84112" containerName="controller-manager"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.092998 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef8cf0b4-6e6e-4c54-a9af-95b814a84112" containerName="controller-manager"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.093126 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef8cf0b4-6e6e-4c54-a9af-95b814a84112" containerName="controller-manager"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.093137 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb67fb0-cbe3-47cd-9029-f54e6e74729d" containerName="oc"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.093151 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="1714688f-61d5-436b-baaf-2668757942fd" containerName="oc"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.093159 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d98e5e9-408e-4d05-807f-8a3993f0da4a" containerName="route-controller-manager"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.093640 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.100369 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5"]
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.101651 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.112415 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5"]
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.118957 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bc795cf67-fn9lm"]
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.135849 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69124a03-2f7c-4852-8c82-610b34bf8ab7-config\") pod \"route-controller-manager-fcbd9cc77-9t7b5\" (UID: \"69124a03-2f7c-4852-8c82-610b34bf8ab7\") " pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.135903 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m7pp\" (UniqueName: \"kubernetes.io/projected/69124a03-2f7c-4852-8c82-610b34bf8ab7-kube-api-access-6m7pp\") pod \"route-controller-manager-fcbd9cc77-9t7b5\" (UID: \"69124a03-2f7c-4852-8c82-610b34bf8ab7\") " pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.135928 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69124a03-2f7c-4852-8c82-610b34bf8ab7-client-ca\") pod \"route-controller-manager-fcbd9cc77-9t7b5\" (UID: \"69124a03-2f7c-4852-8c82-610b34bf8ab7\") " pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.135953 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69124a03-2f7c-4852-8c82-610b34bf8ab7-serving-cert\") pod \"route-controller-manager-fcbd9cc77-9t7b5\" (UID: \"69124a03-2f7c-4852-8c82-610b34bf8ab7\") " pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.135971 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57c3e0b1-582c-402d-a9b6-11eb69505341-serving-cert\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.135988 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57c3e0b1-582c-402d-a9b6-11eb69505341-proxy-ca-bundles\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.136024 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57c3e0b1-582c-402d-a9b6-11eb69505341-config\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.136045 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57c3e0b1-582c-402d-a9b6-11eb69505341-client-ca\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.136070 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5ts7\" (UniqueName: \"kubernetes.io/projected/57c3e0b1-582c-402d-a9b6-11eb69505341-kube-api-access-j5ts7\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.151416 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm" event={"ID":"ef8cf0b4-6e6e-4c54-a9af-95b814a84112","Type":"ContainerDied","Data":"c4dd17aaeb8712c0b17c20bddd30e227356bf39e518bf6d5bb818bb2b3527f83"}
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.151464 4580 scope.go:117] "RemoveContainer" containerID="cf80ef1643ade6c5b369aeffe004ea8b4999e03d0d5c42cffb63a08a8273d212"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.151470 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-764dd66f8c-7q9rm"
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.154770 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" event={"ID":"0d98e5e9-408e-4d05-807f-8a3993f0da4a","Type":"ContainerDied","Data":"1af48b33f85b2cc0811694eb381993677e836600c68ba91518d24c7d484f847e"}
Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.154889 4580 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-569b474869-hklkv" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.186815 4580 scope.go:117] "RemoveContainer" containerID="439ad8c5484cac5a527724ec0a479400abdac6c8b25cd30b1f7946f60267f1c6" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.195607 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-764dd66f8c-7q9rm"] Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.202094 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-764dd66f8c-7q9rm"] Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.206917 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569b474869-hklkv"] Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.209490 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569b474869-hklkv"] Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.229905 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.237826 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57c3e0b1-582c-402d-a9b6-11eb69505341-config\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.238052 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57c3e0b1-582c-402d-a9b6-11eb69505341-client-ca\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: 
\"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.238220 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5ts7\" (UniqueName: \"kubernetes.io/projected/57c3e0b1-582c-402d-a9b6-11eb69505341-kube-api-access-j5ts7\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.238445 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m7pp\" (UniqueName: \"kubernetes.io/projected/69124a03-2f7c-4852-8c82-610b34bf8ab7-kube-api-access-6m7pp\") pod \"route-controller-manager-fcbd9cc77-9t7b5\" (UID: \"69124a03-2f7c-4852-8c82-610b34bf8ab7\") " pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.238571 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69124a03-2f7c-4852-8c82-610b34bf8ab7-config\") pod \"route-controller-manager-fcbd9cc77-9t7b5\" (UID: \"69124a03-2f7c-4852-8c82-610b34bf8ab7\") " pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.238699 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69124a03-2f7c-4852-8c82-610b34bf8ab7-client-ca\") pod \"route-controller-manager-fcbd9cc77-9t7b5\" (UID: \"69124a03-2f7c-4852-8c82-610b34bf8ab7\") " pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.238922 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/69124a03-2f7c-4852-8c82-610b34bf8ab7-serving-cert\") pod \"route-controller-manager-fcbd9cc77-9t7b5\" (UID: \"69124a03-2f7c-4852-8c82-610b34bf8ab7\") " pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.239101 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57c3e0b1-582c-402d-a9b6-11eb69505341-serving-cert\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.239221 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57c3e0b1-582c-402d-a9b6-11eb69505341-proxy-ca-bundles\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.241440 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57c3e0b1-582c-402d-a9b6-11eb69505341-proxy-ca-bundles\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.241877 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57c3e0b1-582c-402d-a9b6-11eb69505341-client-ca\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 
04:57:24.242227 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69124a03-2f7c-4852-8c82-610b34bf8ab7-config\") pod \"route-controller-manager-fcbd9cc77-9t7b5\" (UID: \"69124a03-2f7c-4852-8c82-610b34bf8ab7\") " pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.242509 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57c3e0b1-582c-402d-a9b6-11eb69505341-config\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.245577 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69124a03-2f7c-4852-8c82-610b34bf8ab7-serving-cert\") pod \"route-controller-manager-fcbd9cc77-9t7b5\" (UID: \"69124a03-2f7c-4852-8c82-610b34bf8ab7\") " pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.247855 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57c3e0b1-582c-402d-a9b6-11eb69505341-serving-cert\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.254625 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69124a03-2f7c-4852-8c82-610b34bf8ab7-client-ca\") pod \"route-controller-manager-fcbd9cc77-9t7b5\" (UID: \"69124a03-2f7c-4852-8c82-610b34bf8ab7\") " 
pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.267918 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m7pp\" (UniqueName: \"kubernetes.io/projected/69124a03-2f7c-4852-8c82-610b34bf8ab7-kube-api-access-6m7pp\") pod \"route-controller-manager-fcbd9cc77-9t7b5\" (UID: \"69124a03-2f7c-4852-8c82-610b34bf8ab7\") " pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.276203 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5ts7\" (UniqueName: \"kubernetes.io/projected/57c3e0b1-582c-402d-a9b6-11eb69505341-kube-api-access-j5ts7\") pod \"controller-manager-bc795cf67-fn9lm\" (UID: \"57c3e0b1-582c-402d-a9b6-11eb69505341\") " pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.407965 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.419365 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.461174 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jq5t4"] Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.461856 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jq5t4" podUID="1dd3cd12-741f-4993-8b39-994545e15c2c" containerName="registry-server" containerID="cri-o://de9b5104d4f276a0bd591e7730a9c3faede1841f2ffa9e8303eed7e6a3b21f59" gracePeriod=2 Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.635675 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6ppjj" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.747551 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c002830b-7ac1-4912-9b31-bad37ac63104-utilities\") pod \"c002830b-7ac1-4912-9b31-bad37ac63104\" (UID: \"c002830b-7ac1-4912-9b31-bad37ac63104\") " Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.747611 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6tbg\" (UniqueName: \"kubernetes.io/projected/c002830b-7ac1-4912-9b31-bad37ac63104-kube-api-access-b6tbg\") pod \"c002830b-7ac1-4912-9b31-bad37ac63104\" (UID: \"c002830b-7ac1-4912-9b31-bad37ac63104\") " Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.747659 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c002830b-7ac1-4912-9b31-bad37ac63104-catalog-content\") pod \"c002830b-7ac1-4912-9b31-bad37ac63104\" (UID: \"c002830b-7ac1-4912-9b31-bad37ac63104\") " Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.750255 4580 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c002830b-7ac1-4912-9b31-bad37ac63104-utilities" (OuterVolumeSpecName: "utilities") pod "c002830b-7ac1-4912-9b31-bad37ac63104" (UID: "c002830b-7ac1-4912-9b31-bad37ac63104"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.761154 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c002830b-7ac1-4912-9b31-bad37ac63104-kube-api-access-b6tbg" (OuterVolumeSpecName: "kube-api-access-b6tbg") pod "c002830b-7ac1-4912-9b31-bad37ac63104" (UID: "c002830b-7ac1-4912-9b31-bad37ac63104"). InnerVolumeSpecName "kube-api-access-b6tbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.809106 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c002830b-7ac1-4912-9b31-bad37ac63104-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c002830b-7ac1-4912-9b31-bad37ac63104" (UID: "c002830b-7ac1-4912-9b31-bad37ac63104"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.849134 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c002830b-7ac1-4912-9b31-bad37ac63104-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.849171 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c002830b-7ac1-4912-9b31-bad37ac63104-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:24 crc kubenswrapper[4580]: I0321 04:57:24.849182 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6tbg\" (UniqueName: \"kubernetes.io/projected/c002830b-7ac1-4912-9b31-bad37ac63104-kube-api-access-b6tbg\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.000726 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bc795cf67-fn9lm"] Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.030395 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kk5nv" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" containerName="registry-server" probeResult="failure" output=< Mar 21 04:57:25 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 04:57:25 crc kubenswrapper[4580]: > Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.035454 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jq5t4" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.085056 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5"] Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.152508 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlp7m\" (UniqueName: \"kubernetes.io/projected/1dd3cd12-741f-4993-8b39-994545e15c2c-kube-api-access-tlp7m\") pod \"1dd3cd12-741f-4993-8b39-994545e15c2c\" (UID: \"1dd3cd12-741f-4993-8b39-994545e15c2c\") " Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.152678 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd3cd12-741f-4993-8b39-994545e15c2c-catalog-content\") pod \"1dd3cd12-741f-4993-8b39-994545e15c2c\" (UID: \"1dd3cd12-741f-4993-8b39-994545e15c2c\") " Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.152734 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd3cd12-741f-4993-8b39-994545e15c2c-utilities\") pod \"1dd3cd12-741f-4993-8b39-994545e15c2c\" (UID: \"1dd3cd12-741f-4993-8b39-994545e15c2c\") " Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.154011 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dd3cd12-741f-4993-8b39-994545e15c2c-utilities" (OuterVolumeSpecName: "utilities") pod "1dd3cd12-741f-4993-8b39-994545e15c2c" (UID: "1dd3cd12-741f-4993-8b39-994545e15c2c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.163673 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd3cd12-741f-4993-8b39-994545e15c2c-kube-api-access-tlp7m" (OuterVolumeSpecName: "kube-api-access-tlp7m") pod "1dd3cd12-741f-4993-8b39-994545e15c2c" (UID: "1dd3cd12-741f-4993-8b39-994545e15c2c"). InnerVolumeSpecName "kube-api-access-tlp7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.174593 4580 generic.go:334] "Generic (PLEG): container finished" podID="1dd3cd12-741f-4993-8b39-994545e15c2c" containerID="de9b5104d4f276a0bd591e7730a9c3faede1841f2ffa9e8303eed7e6a3b21f59" exitCode=0 Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.174750 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq5t4" event={"ID":"1dd3cd12-741f-4993-8b39-994545e15c2c","Type":"ContainerDied","Data":"de9b5104d4f276a0bd591e7730a9c3faede1841f2ffa9e8303eed7e6a3b21f59"} Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.174977 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq5t4" event={"ID":"1dd3cd12-741f-4993-8b39-994545e15c2c","Type":"ContainerDied","Data":"65fef62934a61a81e47b3d3e78fbbbd999f553dd7b2e75581edcd2571b9e5b48"} Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.175014 4580 scope.go:117] "RemoveContainer" containerID="de9b5104d4f276a0bd591e7730a9c3faede1841f2ffa9e8303eed7e6a3b21f59" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.174849 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jq5t4" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.183567 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" event={"ID":"69124a03-2f7c-4852-8c82-610b34bf8ab7","Type":"ContainerStarted","Data":"4b8baecead8097e0ecaee4ad24245bbe49c8fe3dad096a56cfc12db6e11d6615"} Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.185528 4580 generic.go:334] "Generic (PLEG): container finished" podID="c002830b-7ac1-4912-9b31-bad37ac63104" containerID="3e3b9d28a7226fe74b412c9b02ab9c697c7f56a2a79a0d821d7daeefce05559e" exitCode=0 Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.185575 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ppjj" event={"ID":"c002830b-7ac1-4912-9b31-bad37ac63104","Type":"ContainerDied","Data":"3e3b9d28a7226fe74b412c9b02ab9c697c7f56a2a79a0d821d7daeefce05559e"} Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.185596 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ppjj" event={"ID":"c002830b-7ac1-4912-9b31-bad37ac63104","Type":"ContainerDied","Data":"6a86b52069b079e0be51527b39057dc1b5d36e6ee6dfbdbb3dda3c50b9ab1160"} Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.185672 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6ppjj" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.193460 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" event={"ID":"57c3e0b1-582c-402d-a9b6-11eb69505341","Type":"ContainerStarted","Data":"4aa20885217c7bf8922f291df05d8d04db6604683805aa9f19e200e9e1cd3954"} Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.201198 4580 scope.go:117] "RemoveContainer" containerID="8b0aa4352205385a7c96780532b569d97e9b34b41f2b2a81e9907a25bd3dda4a" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.211486 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dd3cd12-741f-4993-8b39-994545e15c2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1dd3cd12-741f-4993-8b39-994545e15c2c" (UID: "1dd3cd12-741f-4993-8b39-994545e15c2c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.238600 4580 scope.go:117] "RemoveContainer" containerID="789d58bcf3d2d044bfa0970062663e0129d03a5957f942a94f9f6b8687441a4e" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.255623 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd3cd12-741f-4993-8b39-994545e15c2c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.255671 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd3cd12-741f-4993-8b39-994545e15c2c-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.255684 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlp7m\" (UniqueName: \"kubernetes.io/projected/1dd3cd12-741f-4993-8b39-994545e15c2c-kube-api-access-tlp7m\") 
on node \"crc\" DevicePath \"\"" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.256029 4580 scope.go:117] "RemoveContainer" containerID="de9b5104d4f276a0bd591e7730a9c3faede1841f2ffa9e8303eed7e6a3b21f59" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.256202 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6ppjj"] Mar 21 04:57:25 crc kubenswrapper[4580]: E0321 04:57:25.256919 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de9b5104d4f276a0bd591e7730a9c3faede1841f2ffa9e8303eed7e6a3b21f59\": container with ID starting with de9b5104d4f276a0bd591e7730a9c3faede1841f2ffa9e8303eed7e6a3b21f59 not found: ID does not exist" containerID="de9b5104d4f276a0bd591e7730a9c3faede1841f2ffa9e8303eed7e6a3b21f59" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.257918 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de9b5104d4f276a0bd591e7730a9c3faede1841f2ffa9e8303eed7e6a3b21f59"} err="failed to get container status \"de9b5104d4f276a0bd591e7730a9c3faede1841f2ffa9e8303eed7e6a3b21f59\": rpc error: code = NotFound desc = could not find container \"de9b5104d4f276a0bd591e7730a9c3faede1841f2ffa9e8303eed7e6a3b21f59\": container with ID starting with de9b5104d4f276a0bd591e7730a9c3faede1841f2ffa9e8303eed7e6a3b21f59 not found: ID does not exist" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.258074 4580 scope.go:117] "RemoveContainer" containerID="8b0aa4352205385a7c96780532b569d97e9b34b41f2b2a81e9907a25bd3dda4a" Mar 21 04:57:25 crc kubenswrapper[4580]: E0321 04:57:25.258410 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b0aa4352205385a7c96780532b569d97e9b34b41f2b2a81e9907a25bd3dda4a\": container with ID starting with 8b0aa4352205385a7c96780532b569d97e9b34b41f2b2a81e9907a25bd3dda4a not found: ID does 
not exist" containerID="8b0aa4352205385a7c96780532b569d97e9b34b41f2b2a81e9907a25bd3dda4a" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.258437 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0aa4352205385a7c96780532b569d97e9b34b41f2b2a81e9907a25bd3dda4a"} err="failed to get container status \"8b0aa4352205385a7c96780532b569d97e9b34b41f2b2a81e9907a25bd3dda4a\": rpc error: code = NotFound desc = could not find container \"8b0aa4352205385a7c96780532b569d97e9b34b41f2b2a81e9907a25bd3dda4a\": container with ID starting with 8b0aa4352205385a7c96780532b569d97e9b34b41f2b2a81e9907a25bd3dda4a not found: ID does not exist" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.258450 4580 scope.go:117] "RemoveContainer" containerID="789d58bcf3d2d044bfa0970062663e0129d03a5957f942a94f9f6b8687441a4e" Mar 21 04:57:25 crc kubenswrapper[4580]: E0321 04:57:25.258729 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"789d58bcf3d2d044bfa0970062663e0129d03a5957f942a94f9f6b8687441a4e\": container with ID starting with 789d58bcf3d2d044bfa0970062663e0129d03a5957f942a94f9f6b8687441a4e not found: ID does not exist" containerID="789d58bcf3d2d044bfa0970062663e0129d03a5957f942a94f9f6b8687441a4e" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.258755 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"789d58bcf3d2d044bfa0970062663e0129d03a5957f942a94f9f6b8687441a4e"} err="failed to get container status \"789d58bcf3d2d044bfa0970062663e0129d03a5957f942a94f9f6b8687441a4e\": rpc error: code = NotFound desc = could not find container \"789d58bcf3d2d044bfa0970062663e0129d03a5957f942a94f9f6b8687441a4e\": container with ID starting with 789d58bcf3d2d044bfa0970062663e0129d03a5957f942a94f9f6b8687441a4e not found: ID does not exist" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.258769 4580 
scope.go:117] "RemoveContainer" containerID="3e3b9d28a7226fe74b412c9b02ab9c697c7f56a2a79a0d821d7daeefce05559e" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.264290 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6ppjj"] Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.277277 4580 scope.go:117] "RemoveContainer" containerID="e9b886a4941199d457f32c72a0d33cc8b964cf3a922eb434b07fa8d2c8e5f014" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.306034 4580 scope.go:117] "RemoveContainer" containerID="7677ebf5ff4007868f4233ce112433a7606c14c1631642fc51ddc6acdf743d4c" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.331354 4580 scope.go:117] "RemoveContainer" containerID="3e3b9d28a7226fe74b412c9b02ab9c697c7f56a2a79a0d821d7daeefce05559e" Mar 21 04:57:25 crc kubenswrapper[4580]: E0321 04:57:25.331919 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e3b9d28a7226fe74b412c9b02ab9c697c7f56a2a79a0d821d7daeefce05559e\": container with ID starting with 3e3b9d28a7226fe74b412c9b02ab9c697c7f56a2a79a0d821d7daeefce05559e not found: ID does not exist" containerID="3e3b9d28a7226fe74b412c9b02ab9c697c7f56a2a79a0d821d7daeefce05559e" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.331993 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e3b9d28a7226fe74b412c9b02ab9c697c7f56a2a79a0d821d7daeefce05559e"} err="failed to get container status \"3e3b9d28a7226fe74b412c9b02ab9c697c7f56a2a79a0d821d7daeefce05559e\": rpc error: code = NotFound desc = could not find container \"3e3b9d28a7226fe74b412c9b02ab9c697c7f56a2a79a0d821d7daeefce05559e\": container with ID starting with 3e3b9d28a7226fe74b412c9b02ab9c697c7f56a2a79a0d821d7daeefce05559e not found: ID does not exist" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.332030 4580 scope.go:117] "RemoveContainer" 
containerID="e9b886a4941199d457f32c72a0d33cc8b964cf3a922eb434b07fa8d2c8e5f014" Mar 21 04:57:25 crc kubenswrapper[4580]: E0321 04:57:25.332490 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9b886a4941199d457f32c72a0d33cc8b964cf3a922eb434b07fa8d2c8e5f014\": container with ID starting with e9b886a4941199d457f32c72a0d33cc8b964cf3a922eb434b07fa8d2c8e5f014 not found: ID does not exist" containerID="e9b886a4941199d457f32c72a0d33cc8b964cf3a922eb434b07fa8d2c8e5f014" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.332550 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b886a4941199d457f32c72a0d33cc8b964cf3a922eb434b07fa8d2c8e5f014"} err="failed to get container status \"e9b886a4941199d457f32c72a0d33cc8b964cf3a922eb434b07fa8d2c8e5f014\": rpc error: code = NotFound desc = could not find container \"e9b886a4941199d457f32c72a0d33cc8b964cf3a922eb434b07fa8d2c8e5f014\": container with ID starting with e9b886a4941199d457f32c72a0d33cc8b964cf3a922eb434b07fa8d2c8e5f014 not found: ID does not exist" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.332574 4580 scope.go:117] "RemoveContainer" containerID="7677ebf5ff4007868f4233ce112433a7606c14c1631642fc51ddc6acdf743d4c" Mar 21 04:57:25 crc kubenswrapper[4580]: E0321 04:57:25.333315 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7677ebf5ff4007868f4233ce112433a7606c14c1631642fc51ddc6acdf743d4c\": container with ID starting with 7677ebf5ff4007868f4233ce112433a7606c14c1631642fc51ddc6acdf743d4c not found: ID does not exist" containerID="7677ebf5ff4007868f4233ce112433a7606c14c1631642fc51ddc6acdf743d4c" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.333395 4580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7677ebf5ff4007868f4233ce112433a7606c14c1631642fc51ddc6acdf743d4c"} err="failed to get container status \"7677ebf5ff4007868f4233ce112433a7606c14c1631642fc51ddc6acdf743d4c\": rpc error: code = NotFound desc = could not find container \"7677ebf5ff4007868f4233ce112433a7606c14c1631642fc51ddc6acdf743d4c\": container with ID starting with 7677ebf5ff4007868f4233ce112433a7606c14c1631642fc51ddc6acdf743d4c not found: ID does not exist" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.509895 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jq5t4"] Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.515795 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jq5t4"] Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.625403 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d98e5e9-408e-4d05-807f-8a3993f0da4a" path="/var/lib/kubelet/pods/0d98e5e9-408e-4d05-807f-8a3993f0da4a/volumes" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.626303 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd3cd12-741f-4993-8b39-994545e15c2c" path="/var/lib/kubelet/pods/1dd3cd12-741f-4993-8b39-994545e15c2c/volumes" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.627140 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c002830b-7ac1-4912-9b31-bad37ac63104" path="/var/lib/kubelet/pods/c002830b-7ac1-4912-9b31-bad37ac63104/volumes" Mar 21 04:57:25 crc kubenswrapper[4580]: I0321 04:57:25.628598 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef8cf0b4-6e6e-4c54-a9af-95b814a84112" path="/var/lib/kubelet/pods/ef8cf0b4-6e6e-4c54-a9af-95b814a84112/volumes" Mar 21 04:57:26 crc kubenswrapper[4580]: I0321 04:57:26.202737 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" event={"ID":"69124a03-2f7c-4852-8c82-610b34bf8ab7","Type":"ContainerStarted","Data":"12cfe0ed985fe1f91f78423e96ab9f506804bf28cea401892bb15b01b75951eb"} Mar 21 04:57:26 crc kubenswrapper[4580]: I0321 04:57:26.203875 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" Mar 21 04:57:26 crc kubenswrapper[4580]: I0321 04:57:26.207679 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" event={"ID":"57c3e0b1-582c-402d-a9b6-11eb69505341","Type":"ContainerStarted","Data":"27b51456731ed1f732c392f6578494d7ae592040e80f7ffd10d2ea037c68648d"} Mar 21 04:57:26 crc kubenswrapper[4580]: I0321 04:57:26.207981 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" Mar 21 04:57:26 crc kubenswrapper[4580]: I0321 04:57:26.212526 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" Mar 21 04:57:26 crc kubenswrapper[4580]: I0321 04:57:26.214021 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" Mar 21 04:57:26 crc kubenswrapper[4580]: I0321 04:57:26.230174 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-fcbd9cc77-9t7b5" podStartSLOduration=4.230150141 podStartE2EDuration="4.230150141s" podCreationTimestamp="2026-03-21 04:57:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:57:26.22652943 +0000 UTC m=+351.309113088" watchObservedRunningTime="2026-03-21 04:57:26.230150141 +0000 
UTC m=+351.312733789" Mar 21 04:57:26 crc kubenswrapper[4580]: I0321 04:57:26.310797 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-bc795cf67-fn9lm" podStartSLOduration=4.310758665 podStartE2EDuration="4.310758665s" podCreationTimestamp="2026-03-21 04:57:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:57:26.273189734 +0000 UTC m=+351.355773372" watchObservedRunningTime="2026-03-21 04:57:26.310758665 +0000 UTC m=+351.393342303" Mar 21 04:57:26 crc kubenswrapper[4580]: I0321 04:57:26.856397 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-47qgx"] Mar 21 04:57:26 crc kubenswrapper[4580]: I0321 04:57:26.857104 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-47qgx" podUID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" containerName="registry-server" containerID="cri-o://fdcb21da2ace935c0abda79e4e71e649535f556182a6b1ea39e256d0ea16c547" gracePeriod=2 Mar 21 04:57:27 crc kubenswrapper[4580]: I0321 04:57:27.220752 4580 generic.go:334] "Generic (PLEG): container finished" podID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" containerID="fdcb21da2ace935c0abda79e4e71e649535f556182a6b1ea39e256d0ea16c547" exitCode=0 Mar 21 04:57:27 crc kubenswrapper[4580]: I0321 04:57:27.221444 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47qgx" event={"ID":"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d","Type":"ContainerDied","Data":"fdcb21da2ace935c0abda79e4e71e649535f556182a6b1ea39e256d0ea16c547"} Mar 21 04:57:27 crc kubenswrapper[4580]: I0321 04:57:27.302135 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 04:57:27 crc kubenswrapper[4580]: I0321 04:57:27.388941 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-catalog-content\") pod \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\" (UID: \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\") " Mar 21 04:57:27 crc kubenswrapper[4580]: I0321 04:57:27.389133 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dgk5\" (UniqueName: \"kubernetes.io/projected/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-kube-api-access-9dgk5\") pod \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\" (UID: \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\") " Mar 21 04:57:27 crc kubenswrapper[4580]: I0321 04:57:27.389200 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-utilities\") pod \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\" (UID: \"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d\") " Mar 21 04:57:27 crc kubenswrapper[4580]: I0321 04:57:27.390460 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-utilities" (OuterVolumeSpecName: "utilities") pod "bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" (UID: "bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:57:27 crc kubenswrapper[4580]: I0321 04:57:27.396597 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-kube-api-access-9dgk5" (OuterVolumeSpecName: "kube-api-access-9dgk5") pod "bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" (UID: "bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d"). InnerVolumeSpecName "kube-api-access-9dgk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:27 crc kubenswrapper[4580]: I0321 04:57:27.416330 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" (UID: "bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:57:27 crc kubenswrapper[4580]: I0321 04:57:27.490260 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dgk5\" (UniqueName: \"kubernetes.io/projected/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-kube-api-access-9dgk5\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:27 crc kubenswrapper[4580]: I0321 04:57:27.490298 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:27 crc kubenswrapper[4580]: I0321 04:57:27.490310 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:28 crc kubenswrapper[4580]: I0321 04:57:28.241679 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47qgx" event={"ID":"bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d","Type":"ContainerDied","Data":"ec8bbfbc91e4559923ec364b21f3d4ff175387916004e2566c82486ddde49cb1"} Mar 21 04:57:28 crc kubenswrapper[4580]: I0321 04:57:28.241819 4580 scope.go:117] "RemoveContainer" containerID="fdcb21da2ace935c0abda79e4e71e649535f556182a6b1ea39e256d0ea16c547" Mar 21 04:57:28 crc kubenswrapper[4580]: I0321 04:57:28.242090 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47qgx" Mar 21 04:57:28 crc kubenswrapper[4580]: I0321 04:57:28.273731 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-47qgx"] Mar 21 04:57:28 crc kubenswrapper[4580]: I0321 04:57:28.276364 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-47qgx"] Mar 21 04:57:28 crc kubenswrapper[4580]: I0321 04:57:28.281436 4580 scope.go:117] "RemoveContainer" containerID="eff74d951ae334154034f0b3dc78f9dfb1f403fefc9feef7a4114c863b320ae1" Mar 21 04:57:28 crc kubenswrapper[4580]: I0321 04:57:28.300425 4580 scope.go:117] "RemoveContainer" containerID="de51185f02be9d281bb28d1cbb8576d2fd4cd3d333b5c8fe87f90731f8049eb7" Mar 21 04:57:29 crc kubenswrapper[4580]: I0321 04:57:29.625559 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" path="/var/lib/kubelet/pods/bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d/volumes" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.836180 4580 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.836476 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd3cd12-741f-4993-8b39-994545e15c2c" containerName="extract-content" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.836493 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd3cd12-741f-4993-8b39-994545e15c2c" containerName="extract-content" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.836505 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" containerName="registry-server" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.836512 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" 
containerName="registry-server" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.836522 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd3cd12-741f-4993-8b39-994545e15c2c" containerName="extract-utilities" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.836530 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd3cd12-741f-4993-8b39-994545e15c2c" containerName="extract-utilities" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.836538 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c002830b-7ac1-4912-9b31-bad37ac63104" containerName="extract-content" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.836545 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c002830b-7ac1-4912-9b31-bad37ac63104" containerName="extract-content" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.836551 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c002830b-7ac1-4912-9b31-bad37ac63104" containerName="registry-server" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.836557 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c002830b-7ac1-4912-9b31-bad37ac63104" containerName="registry-server" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.836566 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c002830b-7ac1-4912-9b31-bad37ac63104" containerName="extract-utilities" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.836573 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c002830b-7ac1-4912-9b31-bad37ac63104" containerName="extract-utilities" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.836584 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd3cd12-741f-4993-8b39-994545e15c2c" containerName="registry-server" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.836590 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd3cd12-741f-4993-8b39-994545e15c2c" 
containerName="registry-server" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.836604 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" containerName="extract-content" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.836610 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" containerName="extract-content" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.836619 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" containerName="extract-utilities" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.836625 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" containerName="extract-utilities" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.836719 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb2bedd9-d485-4a54-bcb2-d084f0e9ef0d" containerName="registry-server" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.836754 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="c002830b-7ac1-4912-9b31-bad37ac63104" containerName="registry-server" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.836762 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd3cd12-741f-4993-8b39-994545e15c2c" containerName="registry-server" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.837096 4580 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.837236 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.837342 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882" gracePeriod=15 Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.837410 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50" gracePeriod=15 Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.837466 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b" gracePeriod=15 Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.837410 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84" gracePeriod=15 Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.837577 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930" gracePeriod=15 Mar 21 04:57:30 crc 
kubenswrapper[4580]: I0321 04:57:30.838946 4580 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.839213 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839234 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.839245 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839254 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.839267 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839275 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.839287 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839295 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.839303 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: 
I0321 04:57:30.839310 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.839322 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839329 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.839342 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839348 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.839360 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839367 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.839380 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839387 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839522 4580 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839538 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839548 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839555 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839564 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839572 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839582 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839590 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.839722 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839730 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Mar 21 04:57:30 crc kubenswrapper[4580]: E0321 04:57:30.839741 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839749 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.839887 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.840151 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.916409 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.939667 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.939713 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.939742 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.939804 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.939820 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.939838 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.940020 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:30 crc kubenswrapper[4580]: I0321 04:57:30.940096 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.040767 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.040833 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.040852 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.040882 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.040907 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.040924 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.040940 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.040964 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.041040 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.041079 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.041129 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.041162 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.041190 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.041215 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.041246 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.041277 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.198632 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:57:31 crc kubenswrapper[4580]: W0321 04:57:31.221905 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-4a95f244103005132d9385366bda8480eb905b78a2420b0e1e5b888c0f2fa482 WatchSource:0}: Error finding container 4a95f244103005132d9385366bda8480eb905b78a2420b0e1e5b888c0f2fa482: Status 404 returned error can't find the container with id 4a95f244103005132d9385366bda8480eb905b78a2420b0e1e5b888c0f2fa482 Mar 21 04:57:31 crc kubenswrapper[4580]: E0321 04:57:31.226116 4580 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ec2671548bc81 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:57:31.225062529 +0000 UTC m=+356.307646157,LastTimestamp:2026-03-21 04:57:31.225062529 +0000 UTC m=+356.307646157,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.275447 4580 generic.go:334] "Generic (PLEG): container finished" podID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" containerID="e6e08e5148277152c14110df862955833c2821828d38f35e5ee8f18c0b32279e" exitCode=0 Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.275544 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a5a57185-3ae7-49c8-bc2f-5a57f2be7429","Type":"ContainerDied","Data":"e6e08e5148277152c14110df862955833c2821828d38f35e5ee8f18c0b32279e"} Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.276277 4580 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.276611 4580 status_manager.go:851] "Failed to get status for pod" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.276870 4580 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.277702 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4a95f244103005132d9385366bda8480eb905b78a2420b0e1e5b888c0f2fa482"} Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.280097 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.281286 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.281914 4580 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84" exitCode=0 Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.282366 4580 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50" exitCode=0 Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.282380 4580 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930" exitCode=0 Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.282391 4580 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b" exitCode=2 Mar 21 04:57:31 crc kubenswrapper[4580]: I0321 04:57:31.283500 4580 scope.go:117] "RemoveContainer" containerID="593771cc7cae166c9170e2519ff6127940f2ade2e57dcccc72b5b4d1353ed9ba" Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.293731 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.297961 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"dcd87c17f1fa3eaa4e310eb4239b6fbb6de2c290248f4b58578f9348e8c29a9f"} Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.298447 4580 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.298939 4580 status_manager.go:851] "Failed to get status for pod" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.635867 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.637259 4580 status_manager.go:851] "Failed to get status for pod" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.637719 4580 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.765168 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-var-lock\") pod \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\" (UID: \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\") " Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.765300 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-kube-api-access\") pod \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\" (UID: \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\") " Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.765331 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-kubelet-dir\") pod \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\" (UID: \"a5a57185-3ae7-49c8-bc2f-5a57f2be7429\") " Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.765323 4580 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-var-lock" (OuterVolumeSpecName: "var-lock") pod "a5a57185-3ae7-49c8-bc2f-5a57f2be7429" (UID: "a5a57185-3ae7-49c8-bc2f-5a57f2be7429"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.765479 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a5a57185-3ae7-49c8-bc2f-5a57f2be7429" (UID: "a5a57185-3ae7-49c8-bc2f-5a57f2be7429"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.765683 4580 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-var-lock\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.765702 4580 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.775212 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a5a57185-3ae7-49c8-bc2f-5a57f2be7429" (UID: "a5a57185-3ae7-49c8-bc2f-5a57f2be7429"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:32 crc kubenswrapper[4580]: I0321 04:57:32.867502 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5a57185-3ae7-49c8-bc2f-5a57f2be7429-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.305473 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a5a57185-3ae7-49c8-bc2f-5a57f2be7429","Type":"ContainerDied","Data":"57f6a5aff1e58ecf4c7792ff128663c538f41f7a4a175242168a06b4bfcc06e0"} Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.305882 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57f6a5aff1e58ecf4c7792ff128663c538f41f7a4a175242168a06b4bfcc06e0" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.305760 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.306108 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.309340 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.310149 4580 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.310383 4580 status_manager.go:851] "Failed to get status for pod" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.310538 4580 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.329042 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.332438 4580 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.332835 4580 status_manager.go:851] "Failed to get status 
for pod" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.333046 4580 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882" exitCode=0 Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.333188 4580 scope.go:117] "RemoveContainer" containerID="0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.333634 4580 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.352457 4580 scope.go:117] "RemoveContainer" containerID="480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.367020 4580 scope.go:117] "RemoveContainer" containerID="5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.373324 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.373469 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.373561 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.373752 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.373794 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.373801 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.373874 4580 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.373889 4580 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.373906 4580 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.382626 4580 scope.go:117] "RemoveContainer" containerID="499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.395914 4580 scope.go:117] "RemoveContainer" containerID="cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.418013 4580 scope.go:117] "RemoveContainer" containerID="545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.442059 4580 scope.go:117] "RemoveContainer" containerID="0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84" Mar 21 04:57:33 crc kubenswrapper[4580]: E0321 04:57:33.442871 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84\": container with ID starting with 0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84 not found: ID does not exist" 
containerID="0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.442910 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84"} err="failed to get container status \"0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84\": rpc error: code = NotFound desc = could not find container \"0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84\": container with ID starting with 0f22dac903a87b0062314e9b14ff43e00331b5dc04dce3efdd6d837f2580bb84 not found: ID does not exist" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.442941 4580 scope.go:117] "RemoveContainer" containerID="480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50" Mar 21 04:57:33 crc kubenswrapper[4580]: E0321 04:57:33.443274 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\": container with ID starting with 480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50 not found: ID does not exist" containerID="480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.443290 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50"} err="failed to get container status \"480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\": rpc error: code = NotFound desc = could not find container \"480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50\": container with ID starting with 480e5b311d0a142239cbcf047bc107822d6982369c95dc75cdfd3c98f8f03c50 not found: ID does not exist" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.443304 4580 scope.go:117] 
"RemoveContainer" containerID="5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930" Mar 21 04:57:33 crc kubenswrapper[4580]: E0321 04:57:33.443617 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\": container with ID starting with 5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930 not found: ID does not exist" containerID="5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.443634 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930"} err="failed to get container status \"5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\": rpc error: code = NotFound desc = could not find container \"5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930\": container with ID starting with 5a47b507ccdb5601d95a2611540b2f13e64fbcc079c7a2dd01c919cb811e8930 not found: ID does not exist" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.443649 4580 scope.go:117] "RemoveContainer" containerID="499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b" Mar 21 04:57:33 crc kubenswrapper[4580]: E0321 04:57:33.443988 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\": container with ID starting with 499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b not found: ID does not exist" containerID="499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.444017 4580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b"} err="failed to get container status \"499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\": rpc error: code = NotFound desc = could not find container \"499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b\": container with ID starting with 499681173b3b4a3056e2fdeec697bfba9c4693ff326f8a6e068bdfa216803d8b not found: ID does not exist" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.444042 4580 scope.go:117] "RemoveContainer" containerID="cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882" Mar 21 04:57:33 crc kubenswrapper[4580]: E0321 04:57:33.444376 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\": container with ID starting with cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882 not found: ID does not exist" containerID="cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.444417 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882"} err="failed to get container status \"cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\": rpc error: code = NotFound desc = could not find container \"cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882\": container with ID starting with cd2fb5ef7deb75afdc320b98e233fc0f0bdaecf5c7bb9e86c89af2a9ef563882 not found: ID does not exist" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.444446 4580 scope.go:117] "RemoveContainer" containerID="545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828" Mar 21 04:57:33 crc kubenswrapper[4580]: E0321 04:57:33.444825 4580 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\": container with ID starting with 545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828 not found: ID does not exist" containerID="545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.444868 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828"} err="failed to get container status \"545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\": rpc error: code = NotFound desc = could not find container \"545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828\": container with ID starting with 545c2babec81ea5b76ea83e0b44a0dd31be1ea14888b0c8ed767c6a71608b828 not found: ID does not exist" Mar 21 04:57:33 crc kubenswrapper[4580]: I0321 04:57:33.629858 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.038375 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.038865 4580 status_manager.go:851] "Failed to get status for pod" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" pod="openshift-marketplace/redhat-operators-kk5nv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kk5nv\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.039105 4580 status_manager.go:851] "Failed to get status for pod" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.039285 4580 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.080169 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.081233 4580 status_manager.go:851] "Failed to get status for pod" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" pod="openshift-marketplace/redhat-operators-kk5nv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kk5nv\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.081543 4580 status_manager.go:851] "Failed to get status for pod" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.081898 4580 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:34 crc 
kubenswrapper[4580]: I0321 04:57:34.345310 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.346230 4580 status_manager.go:851] "Failed to get status for pod" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" pod="openshift-marketplace/redhat-operators-kk5nv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kk5nv\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.346447 4580 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.346661 4580 status_manager.go:851] "Failed to get status for pod" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.346953 4580 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.351940 4580 status_manager.go:851] "Failed to get status for pod" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" pod="openshift-marketplace/redhat-operators-kk5nv" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kk5nv\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.352127 4580 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.352699 4580 status_manager.go:851] "Failed to get status for pod" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:34 crc kubenswrapper[4580]: I0321 04:57:34.353020 4580 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:35 crc kubenswrapper[4580]: I0321 04:57:35.621952 4580 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:35 crc kubenswrapper[4580]: I0321 04:57:35.622526 4580 status_manager.go:851] "Failed to get status for pod" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" pod="openshift-marketplace/redhat-operators-kk5nv" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kk5nv\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:35 crc kubenswrapper[4580]: I0321 04:57:35.623047 4580 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:35 crc kubenswrapper[4580]: I0321 04:57:35.623276 4580 status_manager.go:851] "Failed to get status for pod" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:36 crc kubenswrapper[4580]: E0321 04:57:36.632716 4580 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ec2671548bc81 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:57:31.225062529 +0000 UTC m=+356.307646157,LastTimestamp:2026-03-21 04:57:31.225062529 +0000 UTC 
m=+356.307646157,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:57:38 crc kubenswrapper[4580]: E0321 04:57:38.714678 4580 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" volumeName="registry-storage" Mar 21 04:57:39 crc kubenswrapper[4580]: E0321 04:57:39.461814 4580 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:39 crc kubenswrapper[4580]: E0321 04:57:39.462775 4580 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:39 crc kubenswrapper[4580]: E0321 04:57:39.463144 4580 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:39 crc kubenswrapper[4580]: E0321 04:57:39.463376 4580 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:39 crc kubenswrapper[4580]: E0321 04:57:39.463675 4580 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:39 crc kubenswrapper[4580]: I0321 04:57:39.463714 4580 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 21 04:57:39 crc kubenswrapper[4580]: E0321 04:57:39.463985 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="200ms" Mar 21 04:57:39 crc kubenswrapper[4580]: E0321 04:57:39.665307 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="400ms" Mar 21 04:57:39 crc kubenswrapper[4580]: E0321 04:57:39.864905 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:57:39Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:57:39Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:57:39Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:57:39Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:39 crc kubenswrapper[4580]: E0321 04:57:39.865156 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:39 crc kubenswrapper[4580]: E0321 04:57:39.865366 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:39 crc kubenswrapper[4580]: E0321 04:57:39.865573 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 
04:57:39 crc kubenswrapper[4580]: E0321 04:57:39.866052 4580 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:39 crc kubenswrapper[4580]: E0321 04:57:39.866102 4580 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:57:40 crc kubenswrapper[4580]: E0321 04:57:40.066561 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="800ms" Mar 21 04:57:40 crc kubenswrapper[4580]: E0321 04:57:40.873174 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="1.6s" Mar 21 04:57:42 crc kubenswrapper[4580]: E0321 04:57:42.475081 4580 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="3.2s" Mar 21 04:57:42 crc kubenswrapper[4580]: I0321 04:57:42.617897 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:42 crc kubenswrapper[4580]: I0321 04:57:42.618829 4580 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:42 crc kubenswrapper[4580]: I0321 04:57:42.619384 4580 status_manager.go:851] "Failed to get status for pod" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" pod="openshift-marketplace/redhat-operators-kk5nv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kk5nv\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:42 crc kubenswrapper[4580]: I0321 04:57:42.619827 4580 status_manager.go:851] "Failed to get status for pod" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:42 crc kubenswrapper[4580]: I0321 04:57:42.634017 4580 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="390e4700-584c-4822-a638-08a1e97f37e8" Mar 21 04:57:42 crc kubenswrapper[4580]: I0321 04:57:42.634054 4580 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="390e4700-584c-4822-a638-08a1e97f37e8" Mar 21 04:57:42 crc kubenswrapper[4580]: E0321 04:57:42.634480 4580 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:42 crc kubenswrapper[4580]: I0321 04:57:42.635041 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:43 crc kubenswrapper[4580]: I0321 04:57:43.404409 4580 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c81bf3cd8e9cc7e8290c37d5a100b22db6bacd15565a94cd1cca09e33eb46799" exitCode=0 Mar 21 04:57:43 crc kubenswrapper[4580]: I0321 04:57:43.404641 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c81bf3cd8e9cc7e8290c37d5a100b22db6bacd15565a94cd1cca09e33eb46799"} Mar 21 04:57:43 crc kubenswrapper[4580]: I0321 04:57:43.405980 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"75669ae827250f0b8f71b30eb24109f6838a767a13377e5900d5bc73ab41c7a6"} Mar 21 04:57:43 crc kubenswrapper[4580]: I0321 04:57:43.406471 4580 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="390e4700-584c-4822-a638-08a1e97f37e8" Mar 21 04:57:43 crc kubenswrapper[4580]: I0321 04:57:43.406503 4580 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="390e4700-584c-4822-a638-08a1e97f37e8" Mar 21 04:57:43 crc kubenswrapper[4580]: E0321 04:57:43.407062 4580 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:43 crc kubenswrapper[4580]: I0321 04:57:43.407123 4580 status_manager.go:851] "Failed to get status for 
pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:43 crc kubenswrapper[4580]: I0321 04:57:43.407459 4580 status_manager.go:851] "Failed to get status for pod" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" pod="openshift-marketplace/redhat-operators-kk5nv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kk5nv\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:43 crc kubenswrapper[4580]: I0321 04:57:43.407895 4580 status_manager.go:851] "Failed to get status for pod" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 21 04:57:44 crc kubenswrapper[4580]: I0321 04:57:44.421555 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0318423e0c0075e647d898d4d743e26a20b3753ebec4b0acca65072f23a54ba4"} Mar 21 04:57:44 crc kubenswrapper[4580]: I0321 04:57:44.422076 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"305716b775cbde7a6f2cbe7b196089a10ed89f619b3e8a93b4af8b9e54ca4bb5"} Mar 21 04:57:44 crc kubenswrapper[4580]: I0321 04:57:44.422099 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"facb678388583ae3ef414acb923a9e7c4929ff02f93dcfb1ad4db274ff411368"} Mar 21 04:57:45 crc kubenswrapper[4580]: I0321 04:57:45.433869 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 04:57:45 crc kubenswrapper[4580]: I0321 04:57:45.434921 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 21 04:57:45 crc kubenswrapper[4580]: I0321 04:57:45.434967 4580 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e" exitCode=1 Mar 21 04:57:45 crc kubenswrapper[4580]: I0321 04:57:45.435037 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e"} Mar 21 04:57:45 crc kubenswrapper[4580]: I0321 04:57:45.435607 4580 scope.go:117] "RemoveContainer" containerID="1c1e89eaf8678e49d5b67479beeb69ed743b83d240d945b41ea504e2bb8d613e" Mar 21 04:57:45 crc kubenswrapper[4580]: I0321 04:57:45.438628 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7bec12e480fc2582e20fe004efc50da03ae0f9d47e89db28fcc55cbed9c74b92"} Mar 21 04:57:45 crc kubenswrapper[4580]: I0321 04:57:45.438666 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4d03db31ea4d357e6d155d87af050eb2132b0dbab064954f5a4067d57f110866"} Mar 21 04:57:45 crc kubenswrapper[4580]: I0321 04:57:45.438801 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:45 crc kubenswrapper[4580]: I0321 04:57:45.438912 4580 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="390e4700-584c-4822-a638-08a1e97f37e8" Mar 21 04:57:45 crc kubenswrapper[4580]: I0321 04:57:45.438940 4580 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="390e4700-584c-4822-a638-08a1e97f37e8" Mar 21 04:57:46 crc kubenswrapper[4580]: I0321 04:57:46.451020 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 04:57:46 crc kubenswrapper[4580]: I0321 04:57:46.452422 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 21 04:57:46 crc kubenswrapper[4580]: I0321 04:57:46.452497 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4c84db2bb66fad89e607137cc193e71554aff1a13f0d30065c4095fc1f52cf47"} Mar 21 04:57:46 crc kubenswrapper[4580]: I0321 04:57:46.971835 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:57:46 crc kubenswrapper[4580]: I0321 04:57:46.976496 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Mar 21 04:57:47 crc kubenswrapper[4580]: I0321 04:57:47.458347 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:57:47 crc kubenswrapper[4580]: I0321 04:57:47.636034 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:47 crc kubenswrapper[4580]: I0321 04:57:47.636118 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:47 crc kubenswrapper[4580]: I0321 04:57:47.643236 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:50 crc kubenswrapper[4580]: I0321 04:57:50.457534 4580 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:50 crc kubenswrapper[4580]: I0321 04:57:50.598803 4580 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="93938219-11c0-46d0-a0a5-57c8349557b6" Mar 21 04:57:51 crc kubenswrapper[4580]: I0321 04:57:51.483002 4580 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="390e4700-584c-4822-a638-08a1e97f37e8" Mar 21 04:57:51 crc kubenswrapper[4580]: I0321 04:57:51.483029 4580 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="390e4700-584c-4822-a638-08a1e97f37e8" Mar 21 04:57:51 crc kubenswrapper[4580]: I0321 04:57:51.486415 4580 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="93938219-11c0-46d0-a0a5-57c8349557b6" Mar 21 04:57:51 crc kubenswrapper[4580]: I0321 04:57:51.488140 
4580 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://facb678388583ae3ef414acb923a9e7c4929ff02f93dcfb1ad4db274ff411368" Mar 21 04:57:51 crc kubenswrapper[4580]: I0321 04:57:51.488163 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:57:52 crc kubenswrapper[4580]: I0321 04:57:52.499709 4580 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="390e4700-584c-4822-a638-08a1e97f37e8" Mar 21 04:57:52 crc kubenswrapper[4580]: I0321 04:57:52.499843 4580 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="390e4700-584c-4822-a638-08a1e97f37e8" Mar 21 04:57:52 crc kubenswrapper[4580]: I0321 04:57:52.503507 4580 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="93938219-11c0-46d0-a0a5-57c8349557b6" Mar 21 04:57:59 crc kubenswrapper[4580]: I0321 04:57:59.768012 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:58:00 crc kubenswrapper[4580]: I0321 04:58:00.429286 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 21 04:58:00 crc kubenswrapper[4580]: I0321 04:58:00.804335 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 21 04:58:00 crc kubenswrapper[4580]: I0321 04:58:00.877294 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 21 04:58:01 crc kubenswrapper[4580]: I0321 04:58:01.629112 4580 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 21 04:58:02 crc kubenswrapper[4580]: I0321 04:58:02.052072 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 21 04:58:02 crc kubenswrapper[4580]: I0321 04:58:02.101127 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:58:02 crc kubenswrapper[4580]: I0321 04:58:02.184892 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 21 04:58:02 crc kubenswrapper[4580]: I0321 04:58:02.271096 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 21 04:58:02 crc kubenswrapper[4580]: I0321 04:58:02.358017 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 21 04:58:02 crc kubenswrapper[4580]: I0321 04:58:02.562838 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 21 04:58:02 crc kubenswrapper[4580]: I0321 04:58:02.638269 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 21 04:58:02 crc kubenswrapper[4580]: I0321 04:58:02.649226 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:58:02 crc kubenswrapper[4580]: I0321 04:58:02.863764 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 21 04:58:03 crc kubenswrapper[4580]: I0321 04:58:03.111167 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 21 04:58:03 crc kubenswrapper[4580]: I0321 04:58:03.439556 4580 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 21 04:58:03 crc kubenswrapper[4580]: I0321 04:58:03.725952 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 21 04:58:03 crc kubenswrapper[4580]: I0321 04:58:03.730448 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 21 04:58:03 crc kubenswrapper[4580]: I0321 04:58:03.836342 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 21 04:58:03 crc kubenswrapper[4580]: I0321 04:58:03.917043 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 21 04:58:03 crc kubenswrapper[4580]: I0321 04:58:03.992752 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 21 04:58:04 crc kubenswrapper[4580]: I0321 04:58:04.077965 4580 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 21 04:58:04 crc kubenswrapper[4580]: I0321 04:58:04.229544 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 21 04:58:04 crc kubenswrapper[4580]: I0321 04:58:04.561530 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 21 04:58:04 crc kubenswrapper[4580]: I0321 04:58:04.629325 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:58:04 crc kubenswrapper[4580]: I0321 04:58:04.775217 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 21 04:58:04 crc kubenswrapper[4580]: I0321 04:58:04.822967 4580 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 21 04:58:04 crc kubenswrapper[4580]: I0321 04:58:04.887389 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 21 04:58:04 crc kubenswrapper[4580]: I0321 04:58:04.931089 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 21 04:58:04 crc kubenswrapper[4580]: I0321 04:58:04.956890 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.006720 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.047895 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.053836 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.054404 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.116028 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.264302 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.289819 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 21 04:58:05 crc 
kubenswrapper[4580]: I0321 04:58:05.337839 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.351510 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.390162 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.545178 4580 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.569060 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.577013 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.715666 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.780340 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.781465 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.784686 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.824688 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.842650 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.966700 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.974757 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 21 04:58:05 crc kubenswrapper[4580]: I0321 04:58:05.975218 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.060312 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.246739 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.275573 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.306105 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.310863 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.327020 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.349046 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.376315 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.420025 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.541977 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.592054 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.648383 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.828551 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.831749 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.834751 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 21 04:58:06 crc kubenswrapper[4580]: I0321 04:58:06.889698 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.053661 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.067449 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.114308 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.120189 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.131888 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.232964 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.250837 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.263417 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.330880 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.330974 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.388674 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.536978 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.582164 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.690922 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.716693 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.786224 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.953703 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.966107 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 21 04:58:07 crc kubenswrapper[4580]: I0321 04:58:07.983706 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.097446 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.132834 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.225669 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.519302 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.595885 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.636932 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.650450 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.723171 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.768199 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.818827 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.919594 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.943908 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.945014 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 21 04:58:08 crc kubenswrapper[4580]: I0321 04:58:08.983617 4580 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.079295 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.152308 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.222334 4580 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.224717 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.226324 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=39.226306098 podStartE2EDuration="39.226306098s" podCreationTimestamp="2026-03-21 04:57:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:57:50.531941683 +0000 UTC m=+375.614525311" watchObservedRunningTime="2026-03-21 04:58:09.226306098 +0000 UTC m=+394.308889726"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.226756 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.226822 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.236161 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.273432 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.279603 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.279584203 podStartE2EDuration="19.279584203s" podCreationTimestamp="2026-03-21 04:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:58:09.252638372 +0000 UTC m=+394.335222020" watchObservedRunningTime="2026-03-21 04:58:09.279584203 +0000 UTC m=+394.362167831"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.313178 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.411267 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.447835 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.474515 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.477379 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.504532 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.548608 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.591041 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.593018 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.606739 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.615578 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.660850 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.662482 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.739740 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.777722 4580 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.827295 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.832047 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.892820 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.899180 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.939637 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 21 04:58:09 crc kubenswrapper[4580]: I0321 04:58:09.972803 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.016978 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.039249 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.079971 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.083757 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.167699 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.200586 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.237180 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.273158 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.314338 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.319450 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.340217 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.345575 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.354668 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.431984 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.444180 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.591317 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.604277 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.636073 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.721762 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.765512 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.795477 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 21 04:58:10 crc kubenswrapper[4580]: I0321 04:58:10.929966 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.038656 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.073189 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.117601 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.145363 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.168830 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.346416 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.350873 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.483241 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.513669 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.528821 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.550010 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.553600 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.593952 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.645043 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.646738 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.651225 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.684277 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.730846 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.840003 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.889853 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.893426 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.910125 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 21 04:58:11 crc kubenswrapper[4580]: I0321 04:58:11.926148 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.243153 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567818-rvw2x"]
Mar 21 04:58:12 crc kubenswrapper[4580]: E0321 04:58:12.243585 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" containerName="installer"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.243608 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" containerName="installer"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.243744 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5a57185-3ae7-49c8-bc2f-5a57f2be7429" containerName="installer"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.244296 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-rvw2x"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.246662 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.247234 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.247524 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.247559 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.248132 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.251115 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.293624 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.313945 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.329712 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.381331 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.384640 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-rvw2x"]
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.442248 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj5md\" (UniqueName: \"kubernetes.io/projected/7d3189c3-2741-4a7b-9307-2368ec483cf9-kube-api-access-qj5md\") pod \"auto-csr-approver-29567818-rvw2x\" (UID: \"7d3189c3-2741-4a7b-9307-2368ec483cf9\") " pod="openshift-infra/auto-csr-approver-29567818-rvw2x"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.543496 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj5md\" (UniqueName: \"kubernetes.io/projected/7d3189c3-2741-4a7b-9307-2368ec483cf9-kube-api-access-qj5md\") pod \"auto-csr-approver-29567818-rvw2x\" (UID: \"7d3189c3-2741-4a7b-9307-2368ec483cf9\") " pod="openshift-infra/auto-csr-approver-29567818-rvw2x"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.564074 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj5md\" (UniqueName: \"kubernetes.io/projected/7d3189c3-2741-4a7b-9307-2368ec483cf9-kube-api-access-qj5md\") pod \"auto-csr-approver-29567818-rvw2x\" (UID: \"7d3189c3-2741-4a7b-9307-2368ec483cf9\") " pod="openshift-infra/auto-csr-approver-29567818-rvw2x"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.569155 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-rvw2x"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.577330 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.645475 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.672499 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.683168 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.758632 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.760289 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.797147 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.825849 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.833674 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 21 04:58:12 crc kubenswrapper[4580]: I0321 04:58:12.985468 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-rvw2x"]
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.043007 4580 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.043248 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://dcd87c17f1fa3eaa4e310eb4239b6fbb6de2c290248f4b58578f9348e8c29a9f" gracePeriod=5
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.064239 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.130670 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.159608 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.317672 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.319154 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.344702 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.428435 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.551637 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.587280 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.623656 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.627136 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.647377 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567818-rvw2x" event={"ID":"7d3189c3-2741-4a7b-9307-2368ec483cf9","Type":"ContainerStarted","Data":"2d17425caadbf08bf0d36ff336ba52a71e6dbb927097ff090259d55139018c27"}
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.742001 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.762284 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.779350 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.796199 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.843954 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 21 04:58:13
crc kubenswrapper[4580]: I0321 04:58:13.930759 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 21 04:58:13 crc kubenswrapper[4580]: I0321 04:58:13.951755 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.034969 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.209477 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.243963 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.392169 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.405897 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.563339 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.574384 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.615356 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.616684 4580 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.657057 4580 generic.go:334] "Generic (PLEG): container finished" podID="7d3189c3-2741-4a7b-9307-2368ec483cf9" containerID="92a4c095e7a4ade2273e23458e8688196a1c5116405e14261dc8ec3c9af2d3a2" exitCode=0 Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.657102 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567818-rvw2x" event={"ID":"7d3189c3-2741-4a7b-9307-2368ec483cf9","Type":"ContainerDied","Data":"92a4c095e7a4ade2273e23458e8688196a1c5116405e14261dc8ec3c9af2d3a2"} Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.659507 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.693751 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.859424 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.945070 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 21 04:58:14 crc kubenswrapper[4580]: I0321 04:58:14.970437 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.001950 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.103125 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" 
Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.253937 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.273822 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.515613 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.545731 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.571461 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.587893 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.589305 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.645879 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.685920 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.694479 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.702439 4580 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.786440 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.914594 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.994551 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 21 04:58:15 crc kubenswrapper[4580]: I0321 04:58:15.998442 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.002973 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.017445 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-rvw2x" Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.107570 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj5md\" (UniqueName: \"kubernetes.io/projected/7d3189c3-2741-4a7b-9307-2368ec483cf9-kube-api-access-qj5md\") pod \"7d3189c3-2741-4a7b-9307-2368ec483cf9\" (UID: \"7d3189c3-2741-4a7b-9307-2368ec483cf9\") " Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.117014 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3189c3-2741-4a7b-9307-2368ec483cf9-kube-api-access-qj5md" (OuterVolumeSpecName: "kube-api-access-qj5md") pod "7d3189c3-2741-4a7b-9307-2368ec483cf9" (UID: "7d3189c3-2741-4a7b-9307-2368ec483cf9"). InnerVolumeSpecName "kube-api-access-qj5md". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.117348 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.211549 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj5md\" (UniqueName: \"kubernetes.io/projected/7d3189c3-2741-4a7b-9307-2368ec483cf9-kube-api-access-qj5md\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.362299 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.483652 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.488554 4580 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.624425 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.664716 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.673208 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567818-rvw2x" event={"ID":"7d3189c3-2741-4a7b-9307-2368ec483cf9","Type":"ContainerDied","Data":"2d17425caadbf08bf0d36ff336ba52a71e6dbb927097ff090259d55139018c27"} Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.673481 4580 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2d17425caadbf08bf0d36ff336ba52a71e6dbb927097ff090259d55139018c27" Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.673500 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-rvw2x" Mar 21 04:58:16 crc kubenswrapper[4580]: I0321 04:58:16.930330 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.014126 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.030193 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.321442 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n99sq"] Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.321983 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n99sq" podUID="82874992-faa8-4c73-955b-ffe5f02726a7" containerName="registry-server" containerID="cri-o://92c7faaec51f14fd1e7fdcaf810c546556533e202c39331b82a5f2bfa03c49b4" gracePeriod=30 Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.339418 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-77nmx"] Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.339857 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-77nmx" podUID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" containerName="registry-server" containerID="cri-o://0e74c3fade6a8363e1e513360c03f6b271c3fe659dcf4cd9684fc5169792e54d" gracePeriod=30 Mar 21 04:58:17 crc 
kubenswrapper[4580]: I0321 04:58:17.354115 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hnl8s"] Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.354657 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" podUID="8d1b089c-8016-458b-83b5-84f602ea0ba7" containerName="marketplace-operator" containerID="cri-o://f330ce73140deb65d3b48634f1b3afe76b09042aa1b936265d66878f92f36b12" gracePeriod=30 Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.370266 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kpsb"] Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.370704 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4kpsb" podUID="484933df-fe17-42ec-99da-d1187d674051" containerName="registry-server" containerID="cri-o://fc980afd46cf9d25cf756a6876af0ccecbed294fda22f489208963d9bc262001" gracePeriod=30 Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.381080 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kk5nv"] Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.381633 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kk5nv" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" containerName="registry-server" containerID="cri-o://8b056b7426c8b8da087bba83b7bdcf3efa8446cc8e7a0d2e9bb1c7e7ab8b7ef6" gracePeriod=30 Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.387259 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ns8gg"] Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.387592 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ns8gg" 
podUID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" containerName="registry-server" containerID="cri-o://2906bf6ad16aee4244bf07292c53b40e9340abb3618410bf10d6664a6ee01b68" gracePeriod=30 Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.415238 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p95jn"] Mar 21 04:58:17 crc kubenswrapper[4580]: E0321 04:58:17.415797 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3189c3-2741-4a7b-9307-2368ec483cf9" containerName="oc" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.415918 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3189c3-2741-4a7b-9307-2368ec483cf9" containerName="oc" Mar 21 04:58:17 crc kubenswrapper[4580]: E0321 04:58:17.415995 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.416066 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.416301 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3189c3-2741-4a7b-9307-2368ec483cf9" containerName="oc" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.416445 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.417054 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.427545 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p95jn"] Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.435137 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ad5649a-1bef-41a6-aeaa-73f2850df16a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p95jn\" (UID: \"6ad5649a-1bef-41a6-aeaa-73f2850df16a\") " pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.435398 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9xxx\" (UniqueName: \"kubernetes.io/projected/6ad5649a-1bef-41a6-aeaa-73f2850df16a-kube-api-access-q9xxx\") pod \"marketplace-operator-79b997595-p95jn\" (UID: \"6ad5649a-1bef-41a6-aeaa-73f2850df16a\") " pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.435505 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ad5649a-1bef-41a6-aeaa-73f2850df16a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p95jn\" (UID: \"6ad5649a-1bef-41a6-aeaa-73f2850df16a\") " pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.502531 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.538753 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/6ad5649a-1bef-41a6-aeaa-73f2850df16a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p95jn\" (UID: \"6ad5649a-1bef-41a6-aeaa-73f2850df16a\") " pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.538837 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9xxx\" (UniqueName: \"kubernetes.io/projected/6ad5649a-1bef-41a6-aeaa-73f2850df16a-kube-api-access-q9xxx\") pod \"marketplace-operator-79b997595-p95jn\" (UID: \"6ad5649a-1bef-41a6-aeaa-73f2850df16a\") " pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.538902 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ad5649a-1bef-41a6-aeaa-73f2850df16a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p95jn\" (UID: \"6ad5649a-1bef-41a6-aeaa-73f2850df16a\") " pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.540814 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ad5649a-1bef-41a6-aeaa-73f2850df16a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p95jn\" (UID: \"6ad5649a-1bef-41a6-aeaa-73f2850df16a\") " pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.556479 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ad5649a-1bef-41a6-aeaa-73f2850df16a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p95jn\" (UID: \"6ad5649a-1bef-41a6-aeaa-73f2850df16a\") " pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" Mar 21 
04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.559976 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9xxx\" (UniqueName: \"kubernetes.io/projected/6ad5649a-1bef-41a6-aeaa-73f2850df16a-kube-api-access-q9xxx\") pod \"marketplace-operator-79b997595-p95jn\" (UID: \"6ad5649a-1bef-41a6-aeaa-73f2850df16a\") " pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.699140 4580 generic.go:334] "Generic (PLEG): container finished" podID="8d1b089c-8016-458b-83b5-84f602ea0ba7" containerID="f330ce73140deb65d3b48634f1b3afe76b09042aa1b936265d66878f92f36b12" exitCode=0 Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.699321 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" event={"ID":"8d1b089c-8016-458b-83b5-84f602ea0ba7","Type":"ContainerDied","Data":"f330ce73140deb65d3b48634f1b3afe76b09042aa1b936265d66878f92f36b12"} Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.713226 4580 generic.go:334] "Generic (PLEG): container finished" podID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" containerID="2906bf6ad16aee4244bf07292c53b40e9340abb3618410bf10d6664a6ee01b68" exitCode=0 Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.713283 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ns8gg" event={"ID":"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c","Type":"ContainerDied","Data":"2906bf6ad16aee4244bf07292c53b40e9340abb3618410bf10d6664a6ee01b68"} Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.716067 4580 generic.go:334] "Generic (PLEG): container finished" podID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" containerID="8b056b7426c8b8da087bba83b7bdcf3efa8446cc8e7a0d2e9bb1c7e7ab8b7ef6" exitCode=0 Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.716151 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-kk5nv" event={"ID":"9940b0fa-e788-4da2-af4f-da4cdc60f12d","Type":"ContainerDied","Data":"8b056b7426c8b8da087bba83b7bdcf3efa8446cc8e7a0d2e9bb1c7e7ab8b7ef6"} Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.718979 4580 generic.go:334] "Generic (PLEG): container finished" podID="484933df-fe17-42ec-99da-d1187d674051" containerID="fc980afd46cf9d25cf756a6876af0ccecbed294fda22f489208963d9bc262001" exitCode=0 Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.719056 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kpsb" event={"ID":"484933df-fe17-42ec-99da-d1187d674051","Type":"ContainerDied","Data":"fc980afd46cf9d25cf756a6876af0ccecbed294fda22f489208963d9bc262001"} Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.721915 4580 generic.go:334] "Generic (PLEG): container finished" podID="82874992-faa8-4c73-955b-ffe5f02726a7" containerID="92c7faaec51f14fd1e7fdcaf810c546556533e202c39331b82a5f2bfa03c49b4" exitCode=0 Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.721975 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n99sq" event={"ID":"82874992-faa8-4c73-955b-ffe5f02726a7","Type":"ContainerDied","Data":"92c7faaec51f14fd1e7fdcaf810c546556533e202c39331b82a5f2bfa03c49b4"} Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.725907 4580 generic.go:334] "Generic (PLEG): container finished" podID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" containerID="0e74c3fade6a8363e1e513360c03f6b271c3fe659dcf4cd9684fc5169792e54d" exitCode=0 Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.725955 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77nmx" event={"ID":"37b3e873-7ca5-4413-9998-6aaf824d6cd7","Type":"ContainerDied","Data":"0e74c3fade6a8363e1e513360c03f6b271c3fe659dcf4cd9684fc5169792e54d"} Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.764237 4580 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.766036 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.903286 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-77nmx" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.968279 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2dsq\" (UniqueName: \"kubernetes.io/projected/37b3e873-7ca5-4413-9998-6aaf824d6cd7-kube-api-access-l2dsq\") pod \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\" (UID: \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\") " Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.968344 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b3e873-7ca5-4413-9998-6aaf824d6cd7-utilities\") pod \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\" (UID: \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\") " Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.968403 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b3e873-7ca5-4413-9998-6aaf824d6cd7-catalog-content\") pod \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\" (UID: \"37b3e873-7ca5-4413-9998-6aaf824d6cd7\") " Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.969329 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b3e873-7ca5-4413-9998-6aaf824d6cd7-utilities" (OuterVolumeSpecName: "utilities") pod "37b3e873-7ca5-4413-9998-6aaf824d6cd7" (UID: "37b3e873-7ca5-4413-9998-6aaf824d6cd7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:58:17 crc kubenswrapper[4580]: I0321 04:58:17.975574 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b3e873-7ca5-4413-9998-6aaf824d6cd7-kube-api-access-l2dsq" (OuterVolumeSpecName: "kube-api-access-l2dsq") pod "37b3e873-7ca5-4413-9998-6aaf824d6cd7" (UID: "37b3e873-7ca5-4413-9998-6aaf824d6cd7"). InnerVolumeSpecName "kube-api-access-l2dsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.056968 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.069625 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2dsq\" (UniqueName: \"kubernetes.io/projected/37b3e873-7ca5-4413-9998-6aaf824d6cd7-kube-api-access-l2dsq\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.069667 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b3e873-7ca5-4413-9998-6aaf824d6cd7-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.073111 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b3e873-7ca5-4413-9998-6aaf824d6cd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37b3e873-7ca5-4413-9998-6aaf824d6cd7" (UID: "37b3e873-7ca5-4413-9998-6aaf824d6cd7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.158220 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.171752 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b3e873-7ca5-4413-9998-6aaf824d6cd7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.211726 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n99sq" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.222990 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.279183 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbthv\" (UniqueName: \"kubernetes.io/projected/82874992-faa8-4c73-955b-ffe5f02726a7-kube-api-access-pbthv\") pod \"82874992-faa8-4c73-955b-ffe5f02726a7\" (UID: \"82874992-faa8-4c73-955b-ffe5f02726a7\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.279875 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-utilities\") pod \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\" (UID: \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.279933 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmlgk\" (UniqueName: \"kubernetes.io/projected/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-kube-api-access-kmlgk\") pod \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\" (UID: \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.279959 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d1b089c-8016-458b-83b5-84f602ea0ba7-marketplace-trusted-ca\") pod \"8d1b089c-8016-458b-83b5-84f602ea0ba7\" (UID: \"8d1b089c-8016-458b-83b5-84f602ea0ba7\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.279986 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-catalog-content\") pod \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\" (UID: \"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.280039 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8d1b089c-8016-458b-83b5-84f602ea0ba7-marketplace-operator-metrics\") pod \"8d1b089c-8016-458b-83b5-84f602ea0ba7\" (UID: \"8d1b089c-8016-458b-83b5-84f602ea0ba7\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.280071 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82874992-faa8-4c73-955b-ffe5f02726a7-catalog-content\") pod \"82874992-faa8-4c73-955b-ffe5f02726a7\" (UID: \"82874992-faa8-4c73-955b-ffe5f02726a7\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.280102 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzv59\" (UniqueName: \"kubernetes.io/projected/8d1b089c-8016-458b-83b5-84f602ea0ba7-kube-api-access-vzv59\") pod \"8d1b089c-8016-458b-83b5-84f602ea0ba7\" (UID: \"8d1b089c-8016-458b-83b5-84f602ea0ba7\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.280153 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82874992-faa8-4c73-955b-ffe5f02726a7-utilities\") pod \"82874992-faa8-4c73-955b-ffe5f02726a7\" (UID: 
\"82874992-faa8-4c73-955b-ffe5f02726a7\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.282269 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82874992-faa8-4c73-955b-ffe5f02726a7-utilities" (OuterVolumeSpecName: "utilities") pod "82874992-faa8-4c73-955b-ffe5f02726a7" (UID: "82874992-faa8-4c73-955b-ffe5f02726a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.282968 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82874992-faa8-4c73-955b-ffe5f02726a7-kube-api-access-pbthv" (OuterVolumeSpecName: "kube-api-access-pbthv") pod "82874992-faa8-4c73-955b-ffe5f02726a7" (UID: "82874992-faa8-4c73-955b-ffe5f02726a7"). InnerVolumeSpecName "kube-api-access-pbthv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.287920 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-utilities" (OuterVolumeSpecName: "utilities") pod "bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" (UID: "bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.288980 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d1b089c-8016-458b-83b5-84f602ea0ba7-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "8d1b089c-8016-458b-83b5-84f602ea0ba7" (UID: "8d1b089c-8016-458b-83b5-84f602ea0ba7"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.289236 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.290827 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-kube-api-access-kmlgk" (OuterVolumeSpecName: "kube-api-access-kmlgk") pod "bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" (UID: "bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c"). InnerVolumeSpecName "kube-api-access-kmlgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.296586 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1b089c-8016-458b-83b5-84f602ea0ba7-kube-api-access-vzv59" (OuterVolumeSpecName: "kube-api-access-vzv59") pod "8d1b089c-8016-458b-83b5-84f602ea0ba7" (UID: "8d1b089c-8016-458b-83b5-84f602ea0ba7"). InnerVolumeSpecName "kube-api-access-vzv59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.314040 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1b089c-8016-458b-83b5-84f602ea0ba7-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "8d1b089c-8016-458b-83b5-84f602ea0ba7" (UID: "8d1b089c-8016-458b-83b5-84f602ea0ba7"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.331265 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.383763 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnx45\" (UniqueName: \"kubernetes.io/projected/9940b0fa-e788-4da2-af4f-da4cdc60f12d-kube-api-access-vnx45\") pod \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\" (UID: \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.384550 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/484933df-fe17-42ec-99da-d1187d674051-utilities\") pod \"484933df-fe17-42ec-99da-d1187d674051\" (UID: \"484933df-fe17-42ec-99da-d1187d674051\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.384655 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9940b0fa-e788-4da2-af4f-da4cdc60f12d-utilities\") pod \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\" (UID: \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.384718 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpp8l\" (UniqueName: \"kubernetes.io/projected/484933df-fe17-42ec-99da-d1187d674051-kube-api-access-gpp8l\") pod \"484933df-fe17-42ec-99da-d1187d674051\" (UID: \"484933df-fe17-42ec-99da-d1187d674051\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.384802 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/484933df-fe17-42ec-99da-d1187d674051-catalog-content\") pod \"484933df-fe17-42ec-99da-d1187d674051\" (UID: \"484933df-fe17-42ec-99da-d1187d674051\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.384839 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9940b0fa-e788-4da2-af4f-da4cdc60f12d-catalog-content\") pod \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\" (UID: \"9940b0fa-e788-4da2-af4f-da4cdc60f12d\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.385415 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbthv\" (UniqueName: \"kubernetes.io/projected/82874992-faa8-4c73-955b-ffe5f02726a7-kube-api-access-pbthv\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.385443 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.385455 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmlgk\" (UniqueName: \"kubernetes.io/projected/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-kube-api-access-kmlgk\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.385465 4580 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d1b089c-8016-458b-83b5-84f602ea0ba7-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.385494 4580 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8d1b089c-8016-458b-83b5-84f602ea0ba7-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.385505 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzv59\" (UniqueName: \"kubernetes.io/projected/8d1b089c-8016-458b-83b5-84f602ea0ba7-kube-api-access-vzv59\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.385516 4580 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82874992-faa8-4c73-955b-ffe5f02726a7-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.387064 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484933df-fe17-42ec-99da-d1187d674051-utilities" (OuterVolumeSpecName: "utilities") pod "484933df-fe17-42ec-99da-d1187d674051" (UID: "484933df-fe17-42ec-99da-d1187d674051"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.388165 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9940b0fa-e788-4da2-af4f-da4cdc60f12d-utilities" (OuterVolumeSpecName: "utilities") pod "9940b0fa-e788-4da2-af4f-da4cdc60f12d" (UID: "9940b0fa-e788-4da2-af4f-da4cdc60f12d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.395998 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484933df-fe17-42ec-99da-d1187d674051-kube-api-access-gpp8l" (OuterVolumeSpecName: "kube-api-access-gpp8l") pod "484933df-fe17-42ec-99da-d1187d674051" (UID: "484933df-fe17-42ec-99da-d1187d674051"). InnerVolumeSpecName "kube-api-access-gpp8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.397739 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9940b0fa-e788-4da2-af4f-da4cdc60f12d-kube-api-access-vnx45" (OuterVolumeSpecName: "kube-api-access-vnx45") pod "9940b0fa-e788-4da2-af4f-da4cdc60f12d" (UID: "9940b0fa-e788-4da2-af4f-da4cdc60f12d"). InnerVolumeSpecName "kube-api-access-vnx45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.397931 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82874992-faa8-4c73-955b-ffe5f02726a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82874992-faa8-4c73-955b-ffe5f02726a7" (UID: "82874992-faa8-4c73-955b-ffe5f02726a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.451461 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484933df-fe17-42ec-99da-d1187d674051-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "484933df-fe17-42ec-99da-d1187d674051" (UID: "484933df-fe17-42ec-99da-d1187d674051"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.490426 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnx45\" (UniqueName: \"kubernetes.io/projected/9940b0fa-e788-4da2-af4f-da4cdc60f12d-kube-api-access-vnx45\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.490513 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/484933df-fe17-42ec-99da-d1187d674051-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.490993 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9940b0fa-e788-4da2-af4f-da4cdc60f12d-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.491007 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpp8l\" (UniqueName: \"kubernetes.io/projected/484933df-fe17-42ec-99da-d1187d674051-kube-api-access-gpp8l\") on 
node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.491018 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82874992-faa8-4c73-955b-ffe5f02726a7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.491075 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/484933df-fe17-42ec-99da-d1187d674051-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.513840 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.517020 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p95jn"] Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.544625 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" (UID: "bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.565352 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.592933 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.605007 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9940b0fa-e788-4da2-af4f-da4cdc60f12d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9940b0fa-e788-4da2-af4f-da4cdc60f12d" (UID: "9940b0fa-e788-4da2-af4f-da4cdc60f12d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.675509 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.675602 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.693971 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9940b0fa-e788-4da2-af4f-da4cdc60f12d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.734736 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kk5nv" event={"ID":"9940b0fa-e788-4da2-af4f-da4cdc60f12d","Type":"ContainerDied","Data":"0902f30bc0fdb7c2d6d72fd627e107268e9b1ffa1848e962702c70cc3d37f7e9"} Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.734824 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kk5nv" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.735205 4580 scope.go:117] "RemoveContainer" containerID="8b056b7426c8b8da087bba83b7bdcf3efa8446cc8e7a0d2e9bb1c7e7ab8b7ef6" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.741743 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kpsb" event={"ID":"484933df-fe17-42ec-99da-d1187d674051","Type":"ContainerDied","Data":"9fec92962e0748371e579a57393f153a9f931600cc6de8afa90c515e13e2c767"} Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.741816 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4kpsb" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.746554 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n99sq" event={"ID":"82874992-faa8-4c73-955b-ffe5f02726a7","Type":"ContainerDied","Data":"f88e379f490605aba0768efdb6440d644a39f136b6d9d5e331fb19d954796f98"} Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.747158 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n99sq" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.752937 4580 scope.go:117] "RemoveContainer" containerID="b7b7210d8eda8a5bc1a45ecc3459e7cbbabe68d78761b7f78455d143a8965598" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.755110 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77nmx" event={"ID":"37b3e873-7ca5-4413-9998-6aaf824d6cd7","Type":"ContainerDied","Data":"024cf1e5452c36fa48642b9243e039270f6e1de9264d722387e731b466b35c75"} Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.755369 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-77nmx" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.756867 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" event={"ID":"6ad5649a-1bef-41a6-aeaa-73f2850df16a","Type":"ContainerStarted","Data":"21d6659b34d2bcb47dd9e366fbeccf78606b5116e0f5c458b4286975429c13d1"} Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.756903 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" event={"ID":"6ad5649a-1bef-41a6-aeaa-73f2850df16a","Type":"ContainerStarted","Data":"5075e893a3d3ddc05abb234d81e2dcb053fde4cc95cdf73f0a2d63897fdd77d1"} Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.757653 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.761000 4580 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p95jn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body= Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.761078 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" podUID="6ad5649a-1bef-41a6-aeaa-73f2850df16a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.763193 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" 
event={"ID":"8d1b089c-8016-458b-83b5-84f602ea0ba7","Type":"ContainerDied","Data":"3d20e3ae0c79ed7c02d855a4aab846999212f3095fa15da782d453cdf873e181"} Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.763964 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hnl8s" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.775910 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ns8gg" event={"ID":"bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c","Type":"ContainerDied","Data":"a88f19f1b70285467721c81d613fc450142c5bf967175eadb3818a855548af7d"} Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.776075 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ns8gg" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.779370 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.779418 4580 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="dcd87c17f1fa3eaa4e310eb4239b6fbb6de2c290248f4b58578f9348e8c29a9f" exitCode=137 Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.779506 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.793991 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" podStartSLOduration=1.793966852 podStartE2EDuration="1.793966852s" podCreationTimestamp="2026-03-21 04:58:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:58:18.784759449 +0000 UTC m=+403.867343077" watchObservedRunningTime="2026-03-21 04:58:18.793966852 +0000 UTC m=+403.876550480" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.795092 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.795182 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.795234 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.795272 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.795332 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.795703 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.795865 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.796261 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.797348 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.799493 4580 scope.go:117] "RemoveContainer" containerID="1cd15adb1c59d9d1b28ee18dda8dc78af7c5242ef08704cbdc0b2f50fcb364a7" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.813110 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.820484 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kpsb"] Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.830468 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kpsb"] Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.839491 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kk5nv"] Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.847277 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kk5nv"] Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.856990 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n99sq"] Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.859142 4580 scope.go:117] "RemoveContainer" containerID="fc980afd46cf9d25cf756a6876af0ccecbed294fda22f489208963d9bc262001" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.864195 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.879240 4580 scope.go:117] "RemoveContainer" 
containerID="d56fcd27d3ef488fd44c2f25e326ca8c29f0c152404adb1855d11cafe0ae7519" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.888444 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n99sq"] Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.894277 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-77nmx"] Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.896641 4580 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.896684 4580 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.896700 4580 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.896712 4580 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.896723 4580 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.898225 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-77nmx"] Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.901726 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-hnl8s"] Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.905051 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hnl8s"] Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.908129 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ns8gg"] Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.909188 4580 scope.go:117] "RemoveContainer" containerID="5ec157e076f2e1e8f1f17c3ddf48f934757a7ab4c6782b6ad836e57f9fbc65a1" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.910836 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ns8gg"] Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.941893 4580 scope.go:117] "RemoveContainer" containerID="92c7faaec51f14fd1e7fdcaf810c546556533e202c39331b82a5f2bfa03c49b4" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.965003 4580 scope.go:117] "RemoveContainer" containerID="4728e5e044a69f7890b69dccb2c0cba61649a5a887999c873e5ca47b245278b4" Mar 21 04:58:18 crc kubenswrapper[4580]: I0321 04:58:18.999899 4580 scope.go:117] "RemoveContainer" containerID="7ddd64ac0170369ce14acc7b7a9dbf7864d4b0a7c9c40255d2987b2e1bdf9a65" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.020432 4580 scope.go:117] "RemoveContainer" containerID="0e74c3fade6a8363e1e513360c03f6b271c3fe659dcf4cd9684fc5169792e54d" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.036040 4580 scope.go:117] "RemoveContainer" containerID="8806b5164dc3cf5c683b86feded2e3841b19e481958f8f1de36e749a547fe9b5" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.060282 4580 scope.go:117] "RemoveContainer" containerID="bd7ce93da37447c55a0d714fb6aae1b9dddea5562eacf49408b0c54a19a322f2" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.082272 4580 scope.go:117] "RemoveContainer" 
containerID="f330ce73140deb65d3b48634f1b3afe76b09042aa1b936265d66878f92f36b12" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.102649 4580 scope.go:117] "RemoveContainer" containerID="2906bf6ad16aee4244bf07292c53b40e9340abb3618410bf10d6664a6ee01b68" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.126871 4580 scope.go:117] "RemoveContainer" containerID="8967516a25d13542594605c5e175e5e2ad35bd1f4463bf5bf652881a8e972782" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.156428 4580 scope.go:117] "RemoveContainer" containerID="a30f7c887f16080c0beefee2cd736ea22bf42f8eada48466c4890bb0c6c48275" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.176030 4580 scope.go:117] "RemoveContainer" containerID="dcd87c17f1fa3eaa4e310eb4239b6fbb6de2c290248f4b58578f9348e8c29a9f" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.192326 4580 scope.go:117] "RemoveContainer" containerID="dcd87c17f1fa3eaa4e310eb4239b6fbb6de2c290248f4b58578f9348e8c29a9f" Mar 21 04:58:19 crc kubenswrapper[4580]: E0321 04:58:19.193572 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcd87c17f1fa3eaa4e310eb4239b6fbb6de2c290248f4b58578f9348e8c29a9f\": container with ID starting with dcd87c17f1fa3eaa4e310eb4239b6fbb6de2c290248f4b58578f9348e8c29a9f not found: ID does not exist" containerID="dcd87c17f1fa3eaa4e310eb4239b6fbb6de2c290248f4b58578f9348e8c29a9f" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.193626 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd87c17f1fa3eaa4e310eb4239b6fbb6de2c290248f4b58578f9348e8c29a9f"} err="failed to get container status \"dcd87c17f1fa3eaa4e310eb4239b6fbb6de2c290248f4b58578f9348e8c29a9f\": rpc error: code = NotFound desc = could not find container \"dcd87c17f1fa3eaa4e310eb4239b6fbb6de2c290248f4b58578f9348e8c29a9f\": container with ID starting with 
dcd87c17f1fa3eaa4e310eb4239b6fbb6de2c290248f4b58578f9348e8c29a9f not found: ID does not exist" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.626210 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" path="/var/lib/kubelet/pods/37b3e873-7ca5-4413-9998-6aaf824d6cd7/volumes" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.627815 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484933df-fe17-42ec-99da-d1187d674051" path="/var/lib/kubelet/pods/484933df-fe17-42ec-99da-d1187d674051/volumes" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.629572 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82874992-faa8-4c73-955b-ffe5f02726a7" path="/var/lib/kubelet/pods/82874992-faa8-4c73-955b-ffe5f02726a7/volumes" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.631280 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1b089c-8016-458b-83b5-84f602ea0ba7" path="/var/lib/kubelet/pods/8d1b089c-8016-458b-83b5-84f602ea0ba7/volumes" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.631739 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" path="/var/lib/kubelet/pods/9940b0fa-e788-4da2-af4f-da4cdc60f12d/volumes" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.632858 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" path="/var/lib/kubelet/pods/bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c/volumes" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.633426 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.633679 4580 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.650550 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.650682 4580 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3d2f9eb6-4381-4af3-9df3-c8cb0e653db1" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.669044 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.669097 4580 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3d2f9eb6-4381-4af3-9df3-c8cb0e653db1" Mar 21 04:58:19 crc kubenswrapper[4580]: I0321 04:58:19.798299 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p95jn" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.879248 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7b296"] Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880121 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" containerName="extract-utilities" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880138 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" containerName="extract-utilities" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880151 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 
04:58:55.880159 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880173 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" containerName="extract-content" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880180 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" containerName="extract-content" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880189 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" containerName="extract-content" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880198 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" containerName="extract-content" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880208 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82874992-faa8-4c73-955b-ffe5f02726a7" containerName="extract-content" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880216 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="82874992-faa8-4c73-955b-ffe5f02726a7" containerName="extract-content" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880224 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82874992-faa8-4c73-955b-ffe5f02726a7" containerName="extract-utilities" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880230 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="82874992-faa8-4c73-955b-ffe5f02726a7" containerName="extract-utilities" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880239 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 
04:58:55.880246 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880258 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" containerName="extract-utilities" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880265 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" containerName="extract-utilities" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880276 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82874992-faa8-4c73-955b-ffe5f02726a7" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880283 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="82874992-faa8-4c73-955b-ffe5f02726a7" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880292 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" containerName="extract-utilities" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880299 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" containerName="extract-utilities" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880311 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1b089c-8016-458b-83b5-84f602ea0ba7" containerName="marketplace-operator" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880319 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1b089c-8016-458b-83b5-84f602ea0ba7" containerName="marketplace-operator" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880329 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484933df-fe17-42ec-99da-d1187d674051" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: 
I0321 04:58:55.880336 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="484933df-fe17-42ec-99da-d1187d674051" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880344 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880351 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880362 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" containerName="extract-content" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880369 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" containerName="extract-content" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880378 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484933df-fe17-42ec-99da-d1187d674051" containerName="extract-content" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880386 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="484933df-fe17-42ec-99da-d1187d674051" containerName="extract-content" Mar 21 04:58:55 crc kubenswrapper[4580]: E0321 04:58:55.880396 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484933df-fe17-42ec-99da-d1187d674051" containerName="extract-utilities" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880406 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="484933df-fe17-42ec-99da-d1187d674051" containerName="extract-utilities" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880525 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="9940b0fa-e788-4da2-af4f-da4cdc60f12d" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 
04:58:55.880538 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="484933df-fe17-42ec-99da-d1187d674051" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880551 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="82874992-faa8-4c73-955b-ffe5f02726a7" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880561 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="bff4e8c9-6b8d-44cf-8c34-2e5b2f5e4f3c" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880573 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1b089c-8016-458b-83b5-84f602ea0ba7" containerName="marketplace-operator" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.880581 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b3e873-7ca5-4413-9998-6aaf824d6cd7" containerName="registry-server" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.881125 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.893601 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7b296"] Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.998311 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd9aed0f-3742-486c-93f4-866b8330e572-trusted-ca\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.998375 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd9aed0f-3742-486c-93f4-866b8330e572-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.998403 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgnsj\" (UniqueName: \"kubernetes.io/projected/cd9aed0f-3742-486c-93f4-866b8330e572-kube-api-access-zgnsj\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.998478 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd9aed0f-3742-486c-93f4-866b8330e572-registry-certificates\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.999029 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.999173 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd9aed0f-3742-486c-93f4-866b8330e572-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.999256 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd9aed0f-3742-486c-93f4-866b8330e572-registry-tls\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:55 crc kubenswrapper[4580]: I0321 04:58:55.999305 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd9aed0f-3742-486c-93f4-866b8330e572-bound-sa-token\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.038726 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.100861 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd9aed0f-3742-486c-93f4-866b8330e572-registry-tls\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.100924 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd9aed0f-3742-486c-93f4-866b8330e572-bound-sa-token\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.100984 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd9aed0f-3742-486c-93f4-866b8330e572-trusted-ca\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.101015 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd9aed0f-3742-486c-93f4-866b8330e572-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.101045 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zgnsj\" (UniqueName: \"kubernetes.io/projected/cd9aed0f-3742-486c-93f4-866b8330e572-kube-api-access-zgnsj\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.101091 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd9aed0f-3742-486c-93f4-866b8330e572-registry-certificates\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.101136 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd9aed0f-3742-486c-93f4-866b8330e572-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.102908 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd9aed0f-3742-486c-93f4-866b8330e572-registry-certificates\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.103061 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd9aed0f-3742-486c-93f4-866b8330e572-trusted-ca\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 
crc kubenswrapper[4580]: I0321 04:58:56.103108 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd9aed0f-3742-486c-93f4-866b8330e572-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.122431 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd9aed0f-3742-486c-93f4-866b8330e572-registry-tls\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.127086 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd9aed0f-3742-486c-93f4-866b8330e572-bound-sa-token\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.128678 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd9aed0f-3742-486c-93f4-866b8330e572-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.129340 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgnsj\" (UniqueName: \"kubernetes.io/projected/cd9aed0f-3742-486c-93f4-866b8330e572-kube-api-access-zgnsj\") pod \"image-registry-66df7c8f76-7b296\" (UID: \"cd9aed0f-3742-486c-93f4-866b8330e572\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.206381 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:58:56 crc kubenswrapper[4580]: I0321 04:58:56.645578 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7b296"] Mar 21 04:58:57 crc kubenswrapper[4580]: I0321 04:58:57.037114 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7b296" event={"ID":"cd9aed0f-3742-486c-93f4-866b8330e572","Type":"ContainerStarted","Data":"a5220404fc62f16273c6b2b94de31ab01a63ecbc9008d9f44b31b2fe5a158494"} Mar 21 04:58:57 crc kubenswrapper[4580]: I0321 04:58:57.037532 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7b296" event={"ID":"cd9aed0f-3742-486c-93f4-866b8330e572","Type":"ContainerStarted","Data":"edeab41a63f98b7b4dbcffa0fbbae10dff2609a03b386f72fc261271c2e02742"} Mar 21 04:58:57 crc kubenswrapper[4580]: I0321 04:58:57.037560 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.560457 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7b296" podStartSLOduration=6.560436708 podStartE2EDuration="6.560436708s" podCreationTimestamp="2026-03-21 04:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:58:57.066590207 +0000 UTC m=+442.149173845" watchObservedRunningTime="2026-03-21 04:59:01.560436708 +0000 UTC m=+446.643020336" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.562126 4580 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-ggsnm"] Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.563356 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.565354 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.573862 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggsnm"] Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.595897 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2307470f-41d9-48a7-bfce-d96d8a13c568-utilities\") pod \"redhat-marketplace-ggsnm\" (UID: \"2307470f-41d9-48a7-bfce-d96d8a13c568\") " pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.595946 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xllbd\" (UniqueName: \"kubernetes.io/projected/2307470f-41d9-48a7-bfce-d96d8a13c568-kube-api-access-xllbd\") pod \"redhat-marketplace-ggsnm\" (UID: \"2307470f-41d9-48a7-bfce-d96d8a13c568\") " pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.596028 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2307470f-41d9-48a7-bfce-d96d8a13c568-catalog-content\") pod \"redhat-marketplace-ggsnm\" (UID: \"2307470f-41d9-48a7-bfce-d96d8a13c568\") " pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.697499 4580 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2307470f-41d9-48a7-bfce-d96d8a13c568-catalog-content\") pod \"redhat-marketplace-ggsnm\" (UID: \"2307470f-41d9-48a7-bfce-d96d8a13c568\") " pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.697629 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2307470f-41d9-48a7-bfce-d96d8a13c568-utilities\") pod \"redhat-marketplace-ggsnm\" (UID: \"2307470f-41d9-48a7-bfce-d96d8a13c568\") " pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.697663 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xllbd\" (UniqueName: \"kubernetes.io/projected/2307470f-41d9-48a7-bfce-d96d8a13c568-kube-api-access-xllbd\") pod \"redhat-marketplace-ggsnm\" (UID: \"2307470f-41d9-48a7-bfce-d96d8a13c568\") " pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.698209 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2307470f-41d9-48a7-bfce-d96d8a13c568-catalog-content\") pod \"redhat-marketplace-ggsnm\" (UID: \"2307470f-41d9-48a7-bfce-d96d8a13c568\") " pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.698827 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2307470f-41d9-48a7-bfce-d96d8a13c568-utilities\") pod \"redhat-marketplace-ggsnm\" (UID: \"2307470f-41d9-48a7-bfce-d96d8a13c568\") " pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.721974 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xllbd\" (UniqueName: 
\"kubernetes.io/projected/2307470f-41d9-48a7-bfce-d96d8a13c568-kube-api-access-xllbd\") pod \"redhat-marketplace-ggsnm\" (UID: \"2307470f-41d9-48a7-bfce-d96d8a13c568\") " pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.757124 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gbbtm"] Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.758442 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.761953 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.773947 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gbbtm"] Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.799031 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wchq5\" (UniqueName: \"kubernetes.io/projected/fa2c2b42-95d7-47dd-b5c3-47ef8689c50c-kube-api-access-wchq5\") pod \"community-operators-gbbtm\" (UID: \"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c\") " pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.799080 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2c2b42-95d7-47dd-b5c3-47ef8689c50c-utilities\") pod \"community-operators-gbbtm\" (UID: \"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c\") " pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.799115 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fa2c2b42-95d7-47dd-b5c3-47ef8689c50c-catalog-content\") pod \"community-operators-gbbtm\" (UID: \"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c\") " pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.895942 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.900510 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wchq5\" (UniqueName: \"kubernetes.io/projected/fa2c2b42-95d7-47dd-b5c3-47ef8689c50c-kube-api-access-wchq5\") pod \"community-operators-gbbtm\" (UID: \"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c\") " pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.900571 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2c2b42-95d7-47dd-b5c3-47ef8689c50c-utilities\") pod \"community-operators-gbbtm\" (UID: \"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c\") " pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.900612 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2c2b42-95d7-47dd-b5c3-47ef8689c50c-catalog-content\") pod \"community-operators-gbbtm\" (UID: \"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c\") " pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.901204 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2c2b42-95d7-47dd-b5c3-47ef8689c50c-catalog-content\") pod \"community-operators-gbbtm\" (UID: \"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c\") " pod="openshift-marketplace/community-operators-gbbtm" Mar 21 
04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.901263 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2c2b42-95d7-47dd-b5c3-47ef8689c50c-utilities\") pod \"community-operators-gbbtm\" (UID: \"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c\") " pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:01 crc kubenswrapper[4580]: I0321 04:59:01.927169 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wchq5\" (UniqueName: \"kubernetes.io/projected/fa2c2b42-95d7-47dd-b5c3-47ef8689c50c-kube-api-access-wchq5\") pod \"community-operators-gbbtm\" (UID: \"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c\") " pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:02 crc kubenswrapper[4580]: I0321 04:59:02.078744 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:02 crc kubenswrapper[4580]: I0321 04:59:02.158682 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggsnm"] Mar 21 04:59:02 crc kubenswrapper[4580]: I0321 04:59:02.956303 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gbbtm"] Mar 21 04:59:02 crc kubenswrapper[4580]: W0321 04:59:02.971389 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa2c2b42_95d7_47dd_b5c3_47ef8689c50c.slice/crio-c26c17c9d7c1e95c687e75924371da79bde69c1f5ed755e2d702a203265e3c9c WatchSource:0}: Error finding container c26c17c9d7c1e95c687e75924371da79bde69c1f5ed755e2d702a203265e3c9c: Status 404 returned error can't find the container with id c26c17c9d7c1e95c687e75924371da79bde69c1f5ed755e2d702a203265e3c9c Mar 21 04:59:03 crc kubenswrapper[4580]: I0321 04:59:03.075522 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-gbbtm" event={"ID":"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c","Type":"ContainerStarted","Data":"c26c17c9d7c1e95c687e75924371da79bde69c1f5ed755e2d702a203265e3c9c"} Mar 21 04:59:03 crc kubenswrapper[4580]: I0321 04:59:03.077754 4580 generic.go:334] "Generic (PLEG): container finished" podID="2307470f-41d9-48a7-bfce-d96d8a13c568" containerID="1c0a6bb66a85d739de357b690e54dbb4484424d1adfa1620743448c4cb2bb7c1" exitCode=0 Mar 21 04:59:03 crc kubenswrapper[4580]: I0321 04:59:03.077813 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggsnm" event={"ID":"2307470f-41d9-48a7-bfce-d96d8a13c568","Type":"ContainerDied","Data":"1c0a6bb66a85d739de357b690e54dbb4484424d1adfa1620743448c4cb2bb7c1"} Mar 21 04:59:03 crc kubenswrapper[4580]: I0321 04:59:03.077864 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggsnm" event={"ID":"2307470f-41d9-48a7-bfce-d96d8a13c568","Type":"ContainerStarted","Data":"9f32f846dc3ce4ea7887b239e226cdb4e5ddcd39d18f977a8d57ece526c39d59"} Mar 21 04:59:03 crc kubenswrapper[4580]: I0321 04:59:03.959424 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dmvk5"] Mar 21 04:59:03 crc kubenswrapper[4580]: I0321 04:59:03.962035 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:03 crc kubenswrapper[4580]: I0321 04:59:03.965669 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 21 04:59:03 crc kubenswrapper[4580]: I0321 04:59:03.987243 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dmvk5"] Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.035683 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48b0cf7d-100c-4953-88d7-c4775e45c45d-catalog-content\") pod \"redhat-operators-dmvk5\" (UID: \"48b0cf7d-100c-4953-88d7-c4775e45c45d\") " pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.035924 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48b0cf7d-100c-4953-88d7-c4775e45c45d-utilities\") pod \"redhat-operators-dmvk5\" (UID: \"48b0cf7d-100c-4953-88d7-c4775e45c45d\") " pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.036037 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5lnp\" (UniqueName: \"kubernetes.io/projected/48b0cf7d-100c-4953-88d7-c4775e45c45d-kube-api-access-c5lnp\") pod \"redhat-operators-dmvk5\" (UID: \"48b0cf7d-100c-4953-88d7-c4775e45c45d\") " pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.084079 4580 generic.go:334] "Generic (PLEG): container finished" podID="fa2c2b42-95d7-47dd-b5c3-47ef8689c50c" containerID="64caf94ad798919bc84ad61d4655b23c2faf0eb6ea3f2c7114038c7e6ee72bd4" exitCode=0 Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.084166 4580 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbbtm" event={"ID":"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c","Type":"ContainerDied","Data":"64caf94ad798919bc84ad61d4655b23c2faf0eb6ea3f2c7114038c7e6ee72bd4"} Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.086232 4580 generic.go:334] "Generic (PLEG): container finished" podID="2307470f-41d9-48a7-bfce-d96d8a13c568" containerID="200bdf04adc312d9517488cb59aa63f66c95d4d6ff747af8ffa4c3c7a103fd36" exitCode=0 Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.086390 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggsnm" event={"ID":"2307470f-41d9-48a7-bfce-d96d8a13c568","Type":"ContainerDied","Data":"200bdf04adc312d9517488cb59aa63f66c95d4d6ff747af8ffa4c3c7a103fd36"} Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.138099 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5lnp\" (UniqueName: \"kubernetes.io/projected/48b0cf7d-100c-4953-88d7-c4775e45c45d-kube-api-access-c5lnp\") pod \"redhat-operators-dmvk5\" (UID: \"48b0cf7d-100c-4953-88d7-c4775e45c45d\") " pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.139817 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48b0cf7d-100c-4953-88d7-c4775e45c45d-catalog-content\") pod \"redhat-operators-dmvk5\" (UID: \"48b0cf7d-100c-4953-88d7-c4775e45c45d\") " pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.139143 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48b0cf7d-100c-4953-88d7-c4775e45c45d-catalog-content\") pod \"redhat-operators-dmvk5\" (UID: \"48b0cf7d-100c-4953-88d7-c4775e45c45d\") " 
pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.140157 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48b0cf7d-100c-4953-88d7-c4775e45c45d-utilities\") pod \"redhat-operators-dmvk5\" (UID: \"48b0cf7d-100c-4953-88d7-c4775e45c45d\") " pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.140858 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48b0cf7d-100c-4953-88d7-c4775e45c45d-utilities\") pod \"redhat-operators-dmvk5\" (UID: \"48b0cf7d-100c-4953-88d7-c4775e45c45d\") " pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.158377 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ppl4t"] Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.159614 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.161330 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.170636 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5lnp\" (UniqueName: \"kubernetes.io/projected/48b0cf7d-100c-4953-88d7-c4775e45c45d-kube-api-access-c5lnp\") pod \"redhat-operators-dmvk5\" (UID: \"48b0cf7d-100c-4953-88d7-c4775e45c45d\") " pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.177936 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ppl4t"] Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.241441 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/371f2b05-72f1-4289-b499-4490d84d0d38-catalog-content\") pod \"certified-operators-ppl4t\" (UID: \"371f2b05-72f1-4289-b499-4490d84d0d38\") " pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.242018 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/371f2b05-72f1-4289-b499-4490d84d0d38-utilities\") pod \"certified-operators-ppl4t\" (UID: \"371f2b05-72f1-4289-b499-4490d84d0d38\") " pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.242185 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4kn9\" (UniqueName: \"kubernetes.io/projected/371f2b05-72f1-4289-b499-4490d84d0d38-kube-api-access-q4kn9\") pod \"certified-operators-ppl4t\" (UID: 
\"371f2b05-72f1-4289-b499-4490d84d0d38\") " pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.290266 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.343658 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4kn9\" (UniqueName: \"kubernetes.io/projected/371f2b05-72f1-4289-b499-4490d84d0d38-kube-api-access-q4kn9\") pod \"certified-operators-ppl4t\" (UID: \"371f2b05-72f1-4289-b499-4490d84d0d38\") " pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.343736 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/371f2b05-72f1-4289-b499-4490d84d0d38-catalog-content\") pod \"certified-operators-ppl4t\" (UID: \"371f2b05-72f1-4289-b499-4490d84d0d38\") " pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.343821 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/371f2b05-72f1-4289-b499-4490d84d0d38-utilities\") pod \"certified-operators-ppl4t\" (UID: \"371f2b05-72f1-4289-b499-4490d84d0d38\") " pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.344311 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/371f2b05-72f1-4289-b499-4490d84d0d38-utilities\") pod \"certified-operators-ppl4t\" (UID: \"371f2b05-72f1-4289-b499-4490d84d0d38\") " pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.344390 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/371f2b05-72f1-4289-b499-4490d84d0d38-catalog-content\") pod \"certified-operators-ppl4t\" (UID: \"371f2b05-72f1-4289-b499-4490d84d0d38\") " pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.376394 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4kn9\" (UniqueName: \"kubernetes.io/projected/371f2b05-72f1-4289-b499-4490d84d0d38-kube-api-access-q4kn9\") pod \"certified-operators-ppl4t\" (UID: \"371f2b05-72f1-4289-b499-4490d84d0d38\") " pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.508052 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.742486 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dmvk5"] Mar 21 04:59:04 crc kubenswrapper[4580]: I0321 04:59:04.981724 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ppl4t"] Mar 21 04:59:05 crc kubenswrapper[4580]: I0321 04:59:05.098513 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppl4t" event={"ID":"371f2b05-72f1-4289-b499-4490d84d0d38","Type":"ContainerStarted","Data":"29972a66be3f17070cd98ad94043ec042e6e34800bdd80184f34c2a08661228d"} Mar 21 04:59:05 crc kubenswrapper[4580]: I0321 04:59:05.103086 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbbtm" event={"ID":"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c","Type":"ContainerStarted","Data":"f8b5e8a474a39b04cb561f7d7cd1b8a987c1e4fa8e6a73d63dccbda4856c90a6"} Mar 21 04:59:05 crc kubenswrapper[4580]: I0321 04:59:05.106271 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-ggsnm" event={"ID":"2307470f-41d9-48a7-bfce-d96d8a13c568","Type":"ContainerStarted","Data":"8893e489a1243bf5ee801ef1467bc081a9857c20ac92f04b92d824b8cdd32790"} Mar 21 04:59:05 crc kubenswrapper[4580]: I0321 04:59:05.114578 4580 generic.go:334] "Generic (PLEG): container finished" podID="48b0cf7d-100c-4953-88d7-c4775e45c45d" containerID="c9afaf4de1c1a392671335af6fb1ef238f7f80b4a9b12fcb22af9991d982b6a8" exitCode=0 Mar 21 04:59:05 crc kubenswrapper[4580]: I0321 04:59:05.114692 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmvk5" event={"ID":"48b0cf7d-100c-4953-88d7-c4775e45c45d","Type":"ContainerDied","Data":"c9afaf4de1c1a392671335af6fb1ef238f7f80b4a9b12fcb22af9991d982b6a8"} Mar 21 04:59:05 crc kubenswrapper[4580]: I0321 04:59:05.114883 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmvk5" event={"ID":"48b0cf7d-100c-4953-88d7-c4775e45c45d","Type":"ContainerStarted","Data":"23301e361cb2f6361f63e42e1b01592d51f077a89c3118e0cd1a1544ed030283"} Mar 21 04:59:05 crc kubenswrapper[4580]: I0321 04:59:05.164719 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ggsnm" podStartSLOduration=2.580648394 podStartE2EDuration="4.164701584s" podCreationTimestamp="2026-03-21 04:59:01 +0000 UTC" firstStartedPulling="2026-03-21 04:59:03.079330538 +0000 UTC m=+448.161914166" lastFinishedPulling="2026-03-21 04:59:04.663383728 +0000 UTC m=+449.745967356" observedRunningTime="2026-03-21 04:59:05.161478876 +0000 UTC m=+450.244062524" watchObservedRunningTime="2026-03-21 04:59:05.164701584 +0000 UTC m=+450.247285212" Mar 21 04:59:06 crc kubenswrapper[4580]: I0321 04:59:06.121333 4580 generic.go:334] "Generic (PLEG): container finished" podID="371f2b05-72f1-4289-b499-4490d84d0d38" containerID="7032a7499fad5dc0ccf3ab582fcac43dcff9780bda6b8044a49a373d6397d01f" exitCode=0 Mar 21 
04:59:06 crc kubenswrapper[4580]: I0321 04:59:06.121373 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppl4t" event={"ID":"371f2b05-72f1-4289-b499-4490d84d0d38","Type":"ContainerDied","Data":"7032a7499fad5dc0ccf3ab582fcac43dcff9780bda6b8044a49a373d6397d01f"} Mar 21 04:59:06 crc kubenswrapper[4580]: I0321 04:59:06.126233 4580 generic.go:334] "Generic (PLEG): container finished" podID="fa2c2b42-95d7-47dd-b5c3-47ef8689c50c" containerID="f8b5e8a474a39b04cb561f7d7cd1b8a987c1e4fa8e6a73d63dccbda4856c90a6" exitCode=0 Mar 21 04:59:06 crc kubenswrapper[4580]: I0321 04:59:06.126543 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbbtm" event={"ID":"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c","Type":"ContainerDied","Data":"f8b5e8a474a39b04cb561f7d7cd1b8a987c1e4fa8e6a73d63dccbda4856c90a6"} Mar 21 04:59:06 crc kubenswrapper[4580]: I0321 04:59:06.129201 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmvk5" event={"ID":"48b0cf7d-100c-4953-88d7-c4775e45c45d","Type":"ContainerStarted","Data":"1f935517d08f0bfb3b722875899592e2857212ee9bb23e0e6bc6a3da7b99bdc2"} Mar 21 04:59:07 crc kubenswrapper[4580]: I0321 04:59:07.141285 4580 generic.go:334] "Generic (PLEG): container finished" podID="48b0cf7d-100c-4953-88d7-c4775e45c45d" containerID="1f935517d08f0bfb3b722875899592e2857212ee9bb23e0e6bc6a3da7b99bdc2" exitCode=0 Mar 21 04:59:07 crc kubenswrapper[4580]: I0321 04:59:07.141348 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmvk5" event={"ID":"48b0cf7d-100c-4953-88d7-c4775e45c45d","Type":"ContainerDied","Data":"1f935517d08f0bfb3b722875899592e2857212ee9bb23e0e6bc6a3da7b99bdc2"} Mar 21 04:59:07 crc kubenswrapper[4580]: I0321 04:59:07.143749 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppl4t" 
event={"ID":"371f2b05-72f1-4289-b499-4490d84d0d38","Type":"ContainerStarted","Data":"06eb669ae7c035df753d11584a97b2d1bb92cb9b45263ebf731cd37ac12b2d4d"} Mar 21 04:59:07 crc kubenswrapper[4580]: I0321 04:59:07.150607 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbbtm" event={"ID":"fa2c2b42-95d7-47dd-b5c3-47ef8689c50c","Type":"ContainerStarted","Data":"2a49b9ff8ca97f38b8bcfe2b54c30e0466efc1e2b91c64c46ba4a483632f6dbe"} Mar 21 04:59:07 crc kubenswrapper[4580]: I0321 04:59:07.214930 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gbbtm" podStartSLOduration=3.790207436 podStartE2EDuration="6.214910768s" podCreationTimestamp="2026-03-21 04:59:01 +0000 UTC" firstStartedPulling="2026-03-21 04:59:04.088449129 +0000 UTC m=+449.171032757" lastFinishedPulling="2026-03-21 04:59:06.513152471 +0000 UTC m=+451.595736089" observedRunningTime="2026-03-21 04:59:07.211338379 +0000 UTC m=+452.293922027" watchObservedRunningTime="2026-03-21 04:59:07.214910768 +0000 UTC m=+452.297494396" Mar 21 04:59:08 crc kubenswrapper[4580]: I0321 04:59:08.157718 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmvk5" event={"ID":"48b0cf7d-100c-4953-88d7-c4775e45c45d","Type":"ContainerStarted","Data":"da37a6c9b5cb3fbc9c5d845a84817a2c3acac428c4618fe9ceac932cea893afb"} Mar 21 04:59:08 crc kubenswrapper[4580]: I0321 04:59:08.159812 4580 generic.go:334] "Generic (PLEG): container finished" podID="371f2b05-72f1-4289-b499-4490d84d0d38" containerID="06eb669ae7c035df753d11584a97b2d1bb92cb9b45263ebf731cd37ac12b2d4d" exitCode=0 Mar 21 04:59:08 crc kubenswrapper[4580]: I0321 04:59:08.159914 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppl4t" 
event={"ID":"371f2b05-72f1-4289-b499-4490d84d0d38","Type":"ContainerDied","Data":"06eb669ae7c035df753d11584a97b2d1bb92cb9b45263ebf731cd37ac12b2d4d"} Mar 21 04:59:08 crc kubenswrapper[4580]: I0321 04:59:08.159960 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppl4t" event={"ID":"371f2b05-72f1-4289-b499-4490d84d0d38","Type":"ContainerStarted","Data":"2e090d5154a0362b53be1bb3b2707bf0f8a83a48beebb5602d14a092186f2852"} Mar 21 04:59:08 crc kubenswrapper[4580]: I0321 04:59:08.182651 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dmvk5" podStartSLOduration=2.763079199 podStartE2EDuration="5.182632352s" podCreationTimestamp="2026-03-21 04:59:03 +0000 UTC" firstStartedPulling="2026-03-21 04:59:05.11649379 +0000 UTC m=+450.199077408" lastFinishedPulling="2026-03-21 04:59:07.536046933 +0000 UTC m=+452.618630561" observedRunningTime="2026-03-21 04:59:08.179095884 +0000 UTC m=+453.261679522" watchObservedRunningTime="2026-03-21 04:59:08.182632352 +0000 UTC m=+453.265215980" Mar 21 04:59:11 crc kubenswrapper[4580]: I0321 04:59:11.897232 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:11 crc kubenswrapper[4580]: I0321 04:59:11.900814 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:11 crc kubenswrapper[4580]: I0321 04:59:11.945383 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:11 crc kubenswrapper[4580]: I0321 04:59:11.972409 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ppl4t" podStartSLOduration=6.487262803 podStartE2EDuration="7.972379233s" podCreationTimestamp="2026-03-21 04:59:04 +0000 UTC" 
firstStartedPulling="2026-03-21 04:59:06.123214405 +0000 UTC m=+451.205798033" lastFinishedPulling="2026-03-21 04:59:07.608330835 +0000 UTC m=+452.690914463" observedRunningTime="2026-03-21 04:59:08.203941722 +0000 UTC m=+453.286525350" watchObservedRunningTime="2026-03-21 04:59:11.972379233 +0000 UTC m=+457.054962861" Mar 21 04:59:12 crc kubenswrapper[4580]: I0321 04:59:12.079151 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:12 crc kubenswrapper[4580]: I0321 04:59:12.079640 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:12 crc kubenswrapper[4580]: I0321 04:59:12.123770 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:12 crc kubenswrapper[4580]: I0321 04:59:12.230271 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gbbtm" Mar 21 04:59:12 crc kubenswrapper[4580]: I0321 04:59:12.233053 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ggsnm" Mar 21 04:59:14 crc kubenswrapper[4580]: I0321 04:59:14.291342 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:14 crc kubenswrapper[4580]: I0321 04:59:14.291406 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:14 crc kubenswrapper[4580]: I0321 04:59:14.355079 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:14 crc kubenswrapper[4580]: I0321 04:59:14.506165 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:14 crc kubenswrapper[4580]: I0321 04:59:14.506226 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:14 crc kubenswrapper[4580]: I0321 04:59:14.550042 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:15 crc kubenswrapper[4580]: I0321 04:59:15.241037 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ppl4t" Mar 21 04:59:15 crc kubenswrapper[4580]: I0321 04:59:15.259043 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dmvk5" Mar 21 04:59:16 crc kubenswrapper[4580]: I0321 04:59:16.211425 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7b296" Mar 21 04:59:16 crc kubenswrapper[4580]: I0321 04:59:16.280437 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bqkqg"] Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.341800 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" podUID="621054bf-a821-4811-b7b6-5b7d011b8a05" containerName="registry" containerID="cri-o://d04037254d9d5d58ed804eb3977d23d467cd29678ca601d731a677709c6edf5c" gracePeriod=30 Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.696546 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.818303 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-bound-sa-token\") pod \"621054bf-a821-4811-b7b6-5b7d011b8a05\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.818410 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/621054bf-a821-4811-b7b6-5b7d011b8a05-ca-trust-extracted\") pod \"621054bf-a821-4811-b7b6-5b7d011b8a05\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.818436 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/621054bf-a821-4811-b7b6-5b7d011b8a05-trusted-ca\") pod \"621054bf-a821-4811-b7b6-5b7d011b8a05\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.818485 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/621054bf-a821-4811-b7b6-5b7d011b8a05-installation-pull-secrets\") pod \"621054bf-a821-4811-b7b6-5b7d011b8a05\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.818564 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/621054bf-a821-4811-b7b6-5b7d011b8a05-registry-certificates\") pod \"621054bf-a821-4811-b7b6-5b7d011b8a05\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.818589 4580 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8nvfz\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-kube-api-access-8nvfz\") pod \"621054bf-a821-4811-b7b6-5b7d011b8a05\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.818631 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-registry-tls\") pod \"621054bf-a821-4811-b7b6-5b7d011b8a05\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.818857 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"621054bf-a821-4811-b7b6-5b7d011b8a05\" (UID: \"621054bf-a821-4811-b7b6-5b7d011b8a05\") " Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.821166 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/621054bf-a821-4811-b7b6-5b7d011b8a05-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "621054bf-a821-4811-b7b6-5b7d011b8a05" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.821504 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/621054bf-a821-4811-b7b6-5b7d011b8a05-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "621054bf-a821-4811-b7b6-5b7d011b8a05" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.827442 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621054bf-a821-4811-b7b6-5b7d011b8a05-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "621054bf-a821-4811-b7b6-5b7d011b8a05" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.829563 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-kube-api-access-8nvfz" (OuterVolumeSpecName: "kube-api-access-8nvfz") pod "621054bf-a821-4811-b7b6-5b7d011b8a05" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05"). InnerVolumeSpecName "kube-api-access-8nvfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.830265 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "621054bf-a821-4811-b7b6-5b7d011b8a05" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.831367 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "621054bf-a821-4811-b7b6-5b7d011b8a05" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.837140 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "621054bf-a821-4811-b7b6-5b7d011b8a05" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.840522 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/621054bf-a821-4811-b7b6-5b7d011b8a05-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "621054bf-a821-4811-b7b6-5b7d011b8a05" (UID: "621054bf-a821-4811-b7b6-5b7d011b8a05"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.920465 4580 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.920515 4580 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/621054bf-a821-4811-b7b6-5b7d011b8a05-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.920526 4580 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/621054bf-a821-4811-b7b6-5b7d011b8a05-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.920537 4580 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/621054bf-a821-4811-b7b6-5b7d011b8a05-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.920554 4580 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/621054bf-a821-4811-b7b6-5b7d011b8a05-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.920563 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nvfz\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-kube-api-access-8nvfz\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:41 crc kubenswrapper[4580]: I0321 04:59:41.920573 4580 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/621054bf-a821-4811-b7b6-5b7d011b8a05-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:42 crc kubenswrapper[4580]: I0321 04:59:42.381180 4580 generic.go:334] "Generic (PLEG): container finished" podID="621054bf-a821-4811-b7b6-5b7d011b8a05" containerID="d04037254d9d5d58ed804eb3977d23d467cd29678ca601d731a677709c6edf5c" exitCode=0 Mar 21 04:59:42 crc kubenswrapper[4580]: I0321 04:59:42.381282 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" Mar 21 04:59:42 crc kubenswrapper[4580]: I0321 04:59:42.381260 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" event={"ID":"621054bf-a821-4811-b7b6-5b7d011b8a05","Type":"ContainerDied","Data":"d04037254d9d5d58ed804eb3977d23d467cd29678ca601d731a677709c6edf5c"} Mar 21 04:59:42 crc kubenswrapper[4580]: I0321 04:59:42.381948 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bqkqg" event={"ID":"621054bf-a821-4811-b7b6-5b7d011b8a05","Type":"ContainerDied","Data":"579090e3e5cc87a5994b081dd62b03085b7e0f0ca53e3e15e97ec0ca2ce43402"} Mar 21 04:59:42 crc kubenswrapper[4580]: I0321 04:59:42.381993 4580 scope.go:117] "RemoveContainer" containerID="d04037254d9d5d58ed804eb3977d23d467cd29678ca601d731a677709c6edf5c" Mar 21 04:59:42 crc kubenswrapper[4580]: I0321 04:59:42.410383 4580 scope.go:117] "RemoveContainer" containerID="d04037254d9d5d58ed804eb3977d23d467cd29678ca601d731a677709c6edf5c" Mar 21 04:59:42 crc kubenswrapper[4580]: E0321 04:59:42.411825 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d04037254d9d5d58ed804eb3977d23d467cd29678ca601d731a677709c6edf5c\": container with ID starting with d04037254d9d5d58ed804eb3977d23d467cd29678ca601d731a677709c6edf5c not found: ID does not exist" containerID="d04037254d9d5d58ed804eb3977d23d467cd29678ca601d731a677709c6edf5c" Mar 21 04:59:42 crc kubenswrapper[4580]: I0321 04:59:42.411940 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d04037254d9d5d58ed804eb3977d23d467cd29678ca601d731a677709c6edf5c"} err="failed to get container status \"d04037254d9d5d58ed804eb3977d23d467cd29678ca601d731a677709c6edf5c\": rpc error: code = NotFound desc = could not find container 
\"d04037254d9d5d58ed804eb3977d23d467cd29678ca601d731a677709c6edf5c\": container with ID starting with d04037254d9d5d58ed804eb3977d23d467cd29678ca601d731a677709c6edf5c not found: ID does not exist" Mar 21 04:59:42 crc kubenswrapper[4580]: I0321 04:59:42.419942 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bqkqg"] Mar 21 04:59:42 crc kubenswrapper[4580]: I0321 04:59:42.427114 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bqkqg"] Mar 21 04:59:43 crc kubenswrapper[4580]: I0321 04:59:43.631948 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="621054bf-a821-4811-b7b6-5b7d011b8a05" path="/var/lib/kubelet/pods/621054bf-a821-4811-b7b6-5b7d011b8a05/volumes" Mar 21 04:59:45 crc kubenswrapper[4580]: I0321 04:59:45.947618 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:59:45 crc kubenswrapper[4580]: I0321 04:59:45.947667 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.153110 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98"] Mar 21 05:00:00 crc kubenswrapper[4580]: E0321 05:00:00.154081 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621054bf-a821-4811-b7b6-5b7d011b8a05" containerName="registry" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 
05:00:00.154098 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="621054bf-a821-4811-b7b6-5b7d011b8a05" containerName="registry" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.154211 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="621054bf-a821-4811-b7b6-5b7d011b8a05" containerName="registry" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.154713 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.157874 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.160433 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567820-r9xz9"] Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.161550 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-r9xz9" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.163987 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.164219 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.164892 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.168114 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.178838 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-r9xz9"] Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.194070 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98"] Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.301034 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89ff7df0-d49d-4787-bdaf-145ef7647123-secret-volume\") pod \"collect-profiles-29567820-t8d98\" (UID: \"89ff7df0-d49d-4787-bdaf-145ef7647123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.301112 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jdft\" (UniqueName: \"kubernetes.io/projected/89ff7df0-d49d-4787-bdaf-145ef7647123-kube-api-access-6jdft\") pod \"collect-profiles-29567820-t8d98\" (UID: \"89ff7df0-d49d-4787-bdaf-145ef7647123\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.301154 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6f4z\" (UniqueName: \"kubernetes.io/projected/a9ad2cf4-f24a-42b2-a0ac-9793f82d3405-kube-api-access-j6f4z\") pod \"auto-csr-approver-29567820-r9xz9\" (UID: \"a9ad2cf4-f24a-42b2-a0ac-9793f82d3405\") " pod="openshift-infra/auto-csr-approver-29567820-r9xz9" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.301231 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89ff7df0-d49d-4787-bdaf-145ef7647123-config-volume\") pod \"collect-profiles-29567820-t8d98\" (UID: \"89ff7df0-d49d-4787-bdaf-145ef7647123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.402840 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jdft\" (UniqueName: \"kubernetes.io/projected/89ff7df0-d49d-4787-bdaf-145ef7647123-kube-api-access-6jdft\") pod \"collect-profiles-29567820-t8d98\" (UID: \"89ff7df0-d49d-4787-bdaf-145ef7647123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.402974 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6f4z\" (UniqueName: \"kubernetes.io/projected/a9ad2cf4-f24a-42b2-a0ac-9793f82d3405-kube-api-access-j6f4z\") pod \"auto-csr-approver-29567820-r9xz9\" (UID: \"a9ad2cf4-f24a-42b2-a0ac-9793f82d3405\") " pod="openshift-infra/auto-csr-approver-29567820-r9xz9" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.403049 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/89ff7df0-d49d-4787-bdaf-145ef7647123-config-volume\") pod \"collect-profiles-29567820-t8d98\" (UID: \"89ff7df0-d49d-4787-bdaf-145ef7647123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.403127 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89ff7df0-d49d-4787-bdaf-145ef7647123-secret-volume\") pod \"collect-profiles-29567820-t8d98\" (UID: \"89ff7df0-d49d-4787-bdaf-145ef7647123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.404602 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89ff7df0-d49d-4787-bdaf-145ef7647123-config-volume\") pod \"collect-profiles-29567820-t8d98\" (UID: \"89ff7df0-d49d-4787-bdaf-145ef7647123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.414540 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89ff7df0-d49d-4787-bdaf-145ef7647123-secret-volume\") pod \"collect-profiles-29567820-t8d98\" (UID: \"89ff7df0-d49d-4787-bdaf-145ef7647123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.421754 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6f4z\" (UniqueName: \"kubernetes.io/projected/a9ad2cf4-f24a-42b2-a0ac-9793f82d3405-kube-api-access-j6f4z\") pod \"auto-csr-approver-29567820-r9xz9\" (UID: \"a9ad2cf4-f24a-42b2-a0ac-9793f82d3405\") " pod="openshift-infra/auto-csr-approver-29567820-r9xz9" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.422102 4580 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6jdft\" (UniqueName: \"kubernetes.io/projected/89ff7df0-d49d-4787-bdaf-145ef7647123-kube-api-access-6jdft\") pod \"collect-profiles-29567820-t8d98\" (UID: \"89ff7df0-d49d-4787-bdaf-145ef7647123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.486907 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.502406 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-r9xz9" Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.760069 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98"] Mar 21 05:00:00 crc kubenswrapper[4580]: I0321 05:00:00.812477 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-r9xz9"] Mar 21 05:00:01 crc kubenswrapper[4580]: I0321 05:00:01.528009 4580 generic.go:334] "Generic (PLEG): container finished" podID="89ff7df0-d49d-4787-bdaf-145ef7647123" containerID="05ae03480dc5f9c4472df88544f2498e8c3906b54ee45ff1be0d994d971c4017" exitCode=0 Mar 21 05:00:01 crc kubenswrapper[4580]: I0321 05:00:01.528142 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" event={"ID":"89ff7df0-d49d-4787-bdaf-145ef7647123","Type":"ContainerDied","Data":"05ae03480dc5f9c4472df88544f2498e8c3906b54ee45ff1be0d994d971c4017"} Mar 21 05:00:01 crc kubenswrapper[4580]: I0321 05:00:01.528538 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" 
event={"ID":"89ff7df0-d49d-4787-bdaf-145ef7647123","Type":"ContainerStarted","Data":"c7bbdd3579ae8daf26b70ebc5b70985f7900cd95b1341381e6266f834999939a"} Mar 21 05:00:01 crc kubenswrapper[4580]: I0321 05:00:01.529911 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567820-r9xz9" event={"ID":"a9ad2cf4-f24a-42b2-a0ac-9793f82d3405","Type":"ContainerStarted","Data":"9ebbe919d0eee8a7b2afbb477584eeb3677b8b1cca167a80af169efd599cc78f"} Mar 21 05:00:02 crc kubenswrapper[4580]: I0321 05:00:02.539537 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567820-r9xz9" event={"ID":"a9ad2cf4-f24a-42b2-a0ac-9793f82d3405","Type":"ContainerStarted","Data":"3d82b86ba408bf8ebe802a0ff317fb66ac3dc6f003327d8ca90a9d47db056de8"} Mar 21 05:00:02 crc kubenswrapper[4580]: I0321 05:00:02.557720 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567820-r9xz9" podStartSLOduration=1.186202253 podStartE2EDuration="2.557689371s" podCreationTimestamp="2026-03-21 05:00:00 +0000 UTC" firstStartedPulling="2026-03-21 05:00:00.824172975 +0000 UTC m=+505.906756603" lastFinishedPulling="2026-03-21 05:00:02.195660093 +0000 UTC m=+507.278243721" observedRunningTime="2026-03-21 05:00:02.555900221 +0000 UTC m=+507.638483869" watchObservedRunningTime="2026-03-21 05:00:02.557689371 +0000 UTC m=+507.640272999" Mar 21 05:00:02 crc kubenswrapper[4580]: I0321 05:00:02.801717 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" Mar 21 05:00:02 crc kubenswrapper[4580]: I0321 05:00:02.944026 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89ff7df0-d49d-4787-bdaf-145ef7647123-config-volume\") pod \"89ff7df0-d49d-4787-bdaf-145ef7647123\" (UID: \"89ff7df0-d49d-4787-bdaf-145ef7647123\") " Mar 21 05:00:02 crc kubenswrapper[4580]: I0321 05:00:02.944147 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jdft\" (UniqueName: \"kubernetes.io/projected/89ff7df0-d49d-4787-bdaf-145ef7647123-kube-api-access-6jdft\") pod \"89ff7df0-d49d-4787-bdaf-145ef7647123\" (UID: \"89ff7df0-d49d-4787-bdaf-145ef7647123\") " Mar 21 05:00:02 crc kubenswrapper[4580]: I0321 05:00:02.944257 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89ff7df0-d49d-4787-bdaf-145ef7647123-secret-volume\") pod \"89ff7df0-d49d-4787-bdaf-145ef7647123\" (UID: \"89ff7df0-d49d-4787-bdaf-145ef7647123\") " Mar 21 05:00:02 crc kubenswrapper[4580]: I0321 05:00:02.945063 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89ff7df0-d49d-4787-bdaf-145ef7647123-config-volume" (OuterVolumeSpecName: "config-volume") pod "89ff7df0-d49d-4787-bdaf-145ef7647123" (UID: "89ff7df0-d49d-4787-bdaf-145ef7647123"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4580]: I0321 05:00:02.951704 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ff7df0-d49d-4787-bdaf-145ef7647123-kube-api-access-6jdft" (OuterVolumeSpecName: "kube-api-access-6jdft") pod "89ff7df0-d49d-4787-bdaf-145ef7647123" (UID: "89ff7df0-d49d-4787-bdaf-145ef7647123"). 
InnerVolumeSpecName "kube-api-access-6jdft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4580]: I0321 05:00:02.951995 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ff7df0-d49d-4787-bdaf-145ef7647123-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "89ff7df0-d49d-4787-bdaf-145ef7647123" (UID: "89ff7df0-d49d-4787-bdaf-145ef7647123"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:00:03 crc kubenswrapper[4580]: I0321 05:00:03.046316 4580 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89ff7df0-d49d-4787-bdaf-145ef7647123-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:03 crc kubenswrapper[4580]: I0321 05:00:03.046361 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jdft\" (UniqueName: \"kubernetes.io/projected/89ff7df0-d49d-4787-bdaf-145ef7647123-kube-api-access-6jdft\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:03 crc kubenswrapper[4580]: I0321 05:00:03.046385 4580 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89ff7df0-d49d-4787-bdaf-145ef7647123-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:03 crc kubenswrapper[4580]: I0321 05:00:03.547095 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" event={"ID":"89ff7df0-d49d-4787-bdaf-145ef7647123","Type":"ContainerDied","Data":"c7bbdd3579ae8daf26b70ebc5b70985f7900cd95b1341381e6266f834999939a"} Mar 21 05:00:03 crc kubenswrapper[4580]: I0321 05:00:03.547159 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7bbdd3579ae8daf26b70ebc5b70985f7900cd95b1341381e6266f834999939a" Mar 21 05:00:03 crc kubenswrapper[4580]: I0321 05:00:03.547259 4580 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98" Mar 21 05:00:03 crc kubenswrapper[4580]: I0321 05:00:03.550016 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9ad2cf4-f24a-42b2-a0ac-9793f82d3405" containerID="3d82b86ba408bf8ebe802a0ff317fb66ac3dc6f003327d8ca90a9d47db056de8" exitCode=0 Mar 21 05:00:03 crc kubenswrapper[4580]: I0321 05:00:03.550049 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567820-r9xz9" event={"ID":"a9ad2cf4-f24a-42b2-a0ac-9793f82d3405","Type":"ContainerDied","Data":"3d82b86ba408bf8ebe802a0ff317fb66ac3dc6f003327d8ca90a9d47db056de8"} Mar 21 05:00:04 crc kubenswrapper[4580]: I0321 05:00:04.799496 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-r9xz9" Mar 21 05:00:04 crc kubenswrapper[4580]: I0321 05:00:04.909721 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6f4z\" (UniqueName: \"kubernetes.io/projected/a9ad2cf4-f24a-42b2-a0ac-9793f82d3405-kube-api-access-j6f4z\") pod \"a9ad2cf4-f24a-42b2-a0ac-9793f82d3405\" (UID: \"a9ad2cf4-f24a-42b2-a0ac-9793f82d3405\") " Mar 21 05:00:04 crc kubenswrapper[4580]: I0321 05:00:04.939595 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ad2cf4-f24a-42b2-a0ac-9793f82d3405-kube-api-access-j6f4z" (OuterVolumeSpecName: "kube-api-access-j6f4z") pod "a9ad2cf4-f24a-42b2-a0ac-9793f82d3405" (UID: "a9ad2cf4-f24a-42b2-a0ac-9793f82d3405"). InnerVolumeSpecName "kube-api-access-j6f4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:00:05 crc kubenswrapper[4580]: I0321 05:00:05.011534 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6f4z\" (UniqueName: \"kubernetes.io/projected/a9ad2cf4-f24a-42b2-a0ac-9793f82d3405-kube-api-access-j6f4z\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:05 crc kubenswrapper[4580]: I0321 05:00:05.565978 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567820-r9xz9" event={"ID":"a9ad2cf4-f24a-42b2-a0ac-9793f82d3405","Type":"ContainerDied","Data":"9ebbe919d0eee8a7b2afbb477584eeb3677b8b1cca167a80af169efd599cc78f"} Mar 21 05:00:05 crc kubenswrapper[4580]: I0321 05:00:05.566038 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ebbe919d0eee8a7b2afbb477584eeb3677b8b1cca167a80af169efd599cc78f" Mar 21 05:00:05 crc kubenswrapper[4580]: I0321 05:00:05.566118 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-r9xz9" Mar 21 05:00:05 crc kubenswrapper[4580]: I0321 05:00:05.630993 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-8cxbg"] Mar 21 05:00:05 crc kubenswrapper[4580]: I0321 05:00:05.631064 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-8cxbg"] Mar 21 05:00:07 crc kubenswrapper[4580]: I0321 05:00:07.635897 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1714688f-61d5-436b-baaf-2668757942fd" path="/var/lib/kubelet/pods/1714688f-61d5-436b-baaf-2668757942fd/volumes" Mar 21 05:00:15 crc kubenswrapper[4580]: I0321 05:00:15.948056 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 21 05:00:15 crc kubenswrapper[4580]: I0321 05:00:15.949070 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:00:45 crc kubenswrapper[4580]: I0321 05:00:45.948202 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:00:45 crc kubenswrapper[4580]: I0321 05:00:45.949198 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:00:45 crc kubenswrapper[4580]: I0321 05:00:45.949275 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 05:00:45 crc kubenswrapper[4580]: I0321 05:00:45.950207 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c2d2ca0ada0ea0c349eeeb34497c0724179464955a36c9a80f234b23a820b94"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:00:45 crc kubenswrapper[4580]: I0321 05:00:45.950276 4580 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://4c2d2ca0ada0ea0c349eeeb34497c0724179464955a36c9a80f234b23a820b94" gracePeriod=600 Mar 21 05:00:46 crc kubenswrapper[4580]: I0321 05:00:46.302660 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="4c2d2ca0ada0ea0c349eeeb34497c0724179464955a36c9a80f234b23a820b94" exitCode=0 Mar 21 05:00:46 crc kubenswrapper[4580]: I0321 05:00:46.302857 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"4c2d2ca0ada0ea0c349eeeb34497c0724179464955a36c9a80f234b23a820b94"} Mar 21 05:00:46 crc kubenswrapper[4580]: I0321 05:00:46.303585 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"c0eb8b838e32b2cb84a7017035087c480a43ddf5f25d7a157720d87f4ca4f069"} Mar 21 05:00:46 crc kubenswrapper[4580]: I0321 05:00:46.303665 4580 scope.go:117] "RemoveContainer" containerID="cffd042aebabf50815323525cd091797cb8bdec334684d1c8bdb56f49b15baed" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.159913 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567822-t4ldw"] Mar 21 05:02:00 crc kubenswrapper[4580]: E0321 05:02:00.161031 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ff7df0-d49d-4787-bdaf-145ef7647123" containerName="collect-profiles" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.161055 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ff7df0-d49d-4787-bdaf-145ef7647123" containerName="collect-profiles" Mar 21 05:02:00 crc kubenswrapper[4580]: E0321 05:02:00.161079 
4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ad2cf4-f24a-42b2-a0ac-9793f82d3405" containerName="oc" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.161089 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ad2cf4-f24a-42b2-a0ac-9793f82d3405" containerName="oc" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.161255 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ad2cf4-f24a-42b2-a0ac-9793f82d3405" containerName="oc" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.161279 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ff7df0-d49d-4787-bdaf-145ef7647123" containerName="collect-profiles" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.161900 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-t4ldw" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.167763 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.168042 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.168171 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.174430 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-t4ldw"] Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.361330 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k75k\" (UniqueName: \"kubernetes.io/projected/f774fe43-2dcb-4e22-b54a-db3e1b969706-kube-api-access-5k75k\") pod \"auto-csr-approver-29567822-t4ldw\" (UID: \"f774fe43-2dcb-4e22-b54a-db3e1b969706\") " 
pod="openshift-infra/auto-csr-approver-29567822-t4ldw" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.462928 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k75k\" (UniqueName: \"kubernetes.io/projected/f774fe43-2dcb-4e22-b54a-db3e1b969706-kube-api-access-5k75k\") pod \"auto-csr-approver-29567822-t4ldw\" (UID: \"f774fe43-2dcb-4e22-b54a-db3e1b969706\") " pod="openshift-infra/auto-csr-approver-29567822-t4ldw" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.484107 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k75k\" (UniqueName: \"kubernetes.io/projected/f774fe43-2dcb-4e22-b54a-db3e1b969706-kube-api-access-5k75k\") pod \"auto-csr-approver-29567822-t4ldw\" (UID: \"f774fe43-2dcb-4e22-b54a-db3e1b969706\") " pod="openshift-infra/auto-csr-approver-29567822-t4ldw" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.784595 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-t4ldw" Mar 21 05:02:00 crc kubenswrapper[4580]: I0321 05:02:00.995101 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-t4ldw"] Mar 21 05:02:01 crc kubenswrapper[4580]: I0321 05:02:01.005563 4580 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:02:01 crc kubenswrapper[4580]: I0321 05:02:01.789388 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567822-t4ldw" event={"ID":"f774fe43-2dcb-4e22-b54a-db3e1b969706","Type":"ContainerStarted","Data":"c77c4617b4f9d21b275aeb355ba14b34f506218fdf7cec592fb01eea72b68582"} Mar 21 05:02:02 crc kubenswrapper[4580]: I0321 05:02:02.798471 4580 generic.go:334] "Generic (PLEG): container finished" podID="f774fe43-2dcb-4e22-b54a-db3e1b969706" containerID="4a5bab3815b9de316e26b637338530acea669073a0d814e9ade3daa248c0aacb" exitCode=0 Mar 
21 05:02:02 crc kubenswrapper[4580]: I0321 05:02:02.798652 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567822-t4ldw" event={"ID":"f774fe43-2dcb-4e22-b54a-db3e1b969706","Type":"ContainerDied","Data":"4a5bab3815b9de316e26b637338530acea669073a0d814e9ade3daa248c0aacb"} Mar 21 05:02:04 crc kubenswrapper[4580]: I0321 05:02:04.003631 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-t4ldw" Mar 21 05:02:04 crc kubenswrapper[4580]: I0321 05:02:04.119439 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k75k\" (UniqueName: \"kubernetes.io/projected/f774fe43-2dcb-4e22-b54a-db3e1b969706-kube-api-access-5k75k\") pod \"f774fe43-2dcb-4e22-b54a-db3e1b969706\" (UID: \"f774fe43-2dcb-4e22-b54a-db3e1b969706\") " Mar 21 05:02:04 crc kubenswrapper[4580]: I0321 05:02:04.126810 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f774fe43-2dcb-4e22-b54a-db3e1b969706-kube-api-access-5k75k" (OuterVolumeSpecName: "kube-api-access-5k75k") pod "f774fe43-2dcb-4e22-b54a-db3e1b969706" (UID: "f774fe43-2dcb-4e22-b54a-db3e1b969706"). InnerVolumeSpecName "kube-api-access-5k75k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:02:04 crc kubenswrapper[4580]: I0321 05:02:04.220709 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k75k\" (UniqueName: \"kubernetes.io/projected/f774fe43-2dcb-4e22-b54a-db3e1b969706-kube-api-access-5k75k\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:04 crc kubenswrapper[4580]: I0321 05:02:04.814032 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567822-t4ldw" event={"ID":"f774fe43-2dcb-4e22-b54a-db3e1b969706","Type":"ContainerDied","Data":"c77c4617b4f9d21b275aeb355ba14b34f506218fdf7cec592fb01eea72b68582"} Mar 21 05:02:04 crc kubenswrapper[4580]: I0321 05:02:04.814091 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c77c4617b4f9d21b275aeb355ba14b34f506218fdf7cec592fb01eea72b68582" Mar 21 05:02:04 crc kubenswrapper[4580]: I0321 05:02:04.814123 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-t4ldw" Mar 21 05:02:05 crc kubenswrapper[4580]: I0321 05:02:05.073990 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-m2qj9"] Mar 21 05:02:05 crc kubenswrapper[4580]: I0321 05:02:05.077936 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-m2qj9"] Mar 21 05:02:05 crc kubenswrapper[4580]: I0321 05:02:05.626072 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb67fb0-cbe3-47cd-9029-f54e6e74729d" path="/var/lib/kubelet/pods/0cb67fb0-cbe3-47cd-9029-f54e6e74729d/volumes" Mar 21 05:03:15 crc kubenswrapper[4580]: I0321 05:03:15.948465 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 21 05:03:15 crc kubenswrapper[4580]: I0321 05:03:15.949379 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:03:45 crc kubenswrapper[4580]: I0321 05:03:45.947488 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:03:45 crc kubenswrapper[4580]: I0321 05:03:45.949237 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:03:51 crc kubenswrapper[4580]: I0321 05:03:51.669325 4580 scope.go:117] "RemoveContainer" containerID="891881ebecb575e921905eef7c5cc09306edba571f5bd5dd4b455d143e81690a" Mar 21 05:03:51 crc kubenswrapper[4580]: I0321 05:03:51.710180 4580 scope.go:117] "RemoveContainer" containerID="ec455bb30fbb2c325792f041e355d5d7a4ca5f20706914886a644ffc606763cf" Mar 21 05:04:00 crc kubenswrapper[4580]: I0321 05:04:00.141391 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567824-52mdt"] Mar 21 05:04:00 crc kubenswrapper[4580]: E0321 05:04:00.142201 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f774fe43-2dcb-4e22-b54a-db3e1b969706" containerName="oc" Mar 21 05:04:00 crc kubenswrapper[4580]: I0321 05:04:00.142219 4580 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f774fe43-2dcb-4e22-b54a-db3e1b969706" containerName="oc" Mar 21 05:04:00 crc kubenswrapper[4580]: I0321 05:04:00.142382 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f774fe43-2dcb-4e22-b54a-db3e1b969706" containerName="oc" Mar 21 05:04:00 crc kubenswrapper[4580]: I0321 05:04:00.142976 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-52mdt" Mar 21 05:04:00 crc kubenswrapper[4580]: I0321 05:04:00.145486 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:04:00 crc kubenswrapper[4580]: I0321 05:04:00.145938 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:04:00 crc kubenswrapper[4580]: I0321 05:04:00.151259 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:04:00 crc kubenswrapper[4580]: I0321 05:04:00.155536 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-52mdt"] Mar 21 05:04:00 crc kubenswrapper[4580]: I0321 05:04:00.159204 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt68n\" (UniqueName: \"kubernetes.io/projected/a7c7def8-ae32-4d0f-93db-749023ea9f17-kube-api-access-lt68n\") pod \"auto-csr-approver-29567824-52mdt\" (UID: \"a7c7def8-ae32-4d0f-93db-749023ea9f17\") " pod="openshift-infra/auto-csr-approver-29567824-52mdt" Mar 21 05:04:00 crc kubenswrapper[4580]: I0321 05:04:00.260519 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt68n\" (UniqueName: \"kubernetes.io/projected/a7c7def8-ae32-4d0f-93db-749023ea9f17-kube-api-access-lt68n\") pod \"auto-csr-approver-29567824-52mdt\" (UID: \"a7c7def8-ae32-4d0f-93db-749023ea9f17\") " 
pod="openshift-infra/auto-csr-approver-29567824-52mdt" Mar 21 05:04:00 crc kubenswrapper[4580]: I0321 05:04:00.289469 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt68n\" (UniqueName: \"kubernetes.io/projected/a7c7def8-ae32-4d0f-93db-749023ea9f17-kube-api-access-lt68n\") pod \"auto-csr-approver-29567824-52mdt\" (UID: \"a7c7def8-ae32-4d0f-93db-749023ea9f17\") " pod="openshift-infra/auto-csr-approver-29567824-52mdt" Mar 21 05:04:00 crc kubenswrapper[4580]: I0321 05:04:00.464439 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-52mdt" Mar 21 05:04:00 crc kubenswrapper[4580]: I0321 05:04:00.649445 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-52mdt"] Mar 21 05:04:01 crc kubenswrapper[4580]: I0321 05:04:01.579892 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567824-52mdt" event={"ID":"a7c7def8-ae32-4d0f-93db-749023ea9f17","Type":"ContainerStarted","Data":"9b77f425742711d734bfaa7d8b34a666352960d45895b77e87920e08b904e088"} Mar 21 05:04:02 crc kubenswrapper[4580]: I0321 05:04:02.588550 4580 generic.go:334] "Generic (PLEG): container finished" podID="a7c7def8-ae32-4d0f-93db-749023ea9f17" containerID="ceb20283c2e8e490de1e43528dc0099be0fba578cd24941750b1143b6d33fb17" exitCode=0 Mar 21 05:04:02 crc kubenswrapper[4580]: I0321 05:04:02.588644 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567824-52mdt" event={"ID":"a7c7def8-ae32-4d0f-93db-749023ea9f17","Type":"ContainerDied","Data":"ceb20283c2e8e490de1e43528dc0099be0fba578cd24941750b1143b6d33fb17"} Mar 21 05:04:03 crc kubenswrapper[4580]: I0321 05:04:03.821562 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-52mdt" Mar 21 05:04:04 crc kubenswrapper[4580]: I0321 05:04:04.017279 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt68n\" (UniqueName: \"kubernetes.io/projected/a7c7def8-ae32-4d0f-93db-749023ea9f17-kube-api-access-lt68n\") pod \"a7c7def8-ae32-4d0f-93db-749023ea9f17\" (UID: \"a7c7def8-ae32-4d0f-93db-749023ea9f17\") " Mar 21 05:04:04 crc kubenswrapper[4580]: I0321 05:04:04.023050 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c7def8-ae32-4d0f-93db-749023ea9f17-kube-api-access-lt68n" (OuterVolumeSpecName: "kube-api-access-lt68n") pod "a7c7def8-ae32-4d0f-93db-749023ea9f17" (UID: "a7c7def8-ae32-4d0f-93db-749023ea9f17"). InnerVolumeSpecName "kube-api-access-lt68n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:04:04 crc kubenswrapper[4580]: I0321 05:04:04.118816 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt68n\" (UniqueName: \"kubernetes.io/projected/a7c7def8-ae32-4d0f-93db-749023ea9f17-kube-api-access-lt68n\") on node \"crc\" DevicePath \"\"" Mar 21 05:04:04 crc kubenswrapper[4580]: I0321 05:04:04.603387 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567824-52mdt" event={"ID":"a7c7def8-ae32-4d0f-93db-749023ea9f17","Type":"ContainerDied","Data":"9b77f425742711d734bfaa7d8b34a666352960d45895b77e87920e08b904e088"} Mar 21 05:04:04 crc kubenswrapper[4580]: I0321 05:04:04.603433 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b77f425742711d734bfaa7d8b34a666352960d45895b77e87920e08b904e088" Mar 21 05:04:04 crc kubenswrapper[4580]: I0321 05:04:04.603447 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-52mdt" Mar 21 05:04:04 crc kubenswrapper[4580]: I0321 05:04:04.885482 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-rvw2x"] Mar 21 05:04:04 crc kubenswrapper[4580]: I0321 05:04:04.889627 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-rvw2x"] Mar 21 05:04:05 crc kubenswrapper[4580]: I0321 05:04:05.625121 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d3189c3-2741-4a7b-9307-2368ec483cf9" path="/var/lib/kubelet/pods/7d3189c3-2741-4a7b-9307-2368ec483cf9/volumes" Mar 21 05:04:15 crc kubenswrapper[4580]: I0321 05:04:15.947984 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:04:15 crc kubenswrapper[4580]: I0321 05:04:15.948598 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:04:15 crc kubenswrapper[4580]: I0321 05:04:15.948651 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 05:04:15 crc kubenswrapper[4580]: I0321 05:04:15.949307 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c0eb8b838e32b2cb84a7017035087c480a43ddf5f25d7a157720d87f4ca4f069"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:04:15 crc kubenswrapper[4580]: I0321 05:04:15.949372 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://c0eb8b838e32b2cb84a7017035087c480a43ddf5f25d7a157720d87f4ca4f069" gracePeriod=600 Mar 21 05:04:16 crc kubenswrapper[4580]: I0321 05:04:16.679702 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="c0eb8b838e32b2cb84a7017035087c480a43ddf5f25d7a157720d87f4ca4f069" exitCode=0 Mar 21 05:04:16 crc kubenswrapper[4580]: I0321 05:04:16.679963 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"c0eb8b838e32b2cb84a7017035087c480a43ddf5f25d7a157720d87f4ca4f069"} Mar 21 05:04:16 crc kubenswrapper[4580]: I0321 05:04:16.680256 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"76a03eb87bee439fb7189493fe11b7778fb36a6c538f9c47967069b07415ab8b"} Mar 21 05:04:16 crc kubenswrapper[4580]: I0321 05:04:16.680283 4580 scope.go:117] "RemoveContainer" containerID="4c2d2ca0ada0ea0c349eeeb34497c0724179464955a36c9a80f234b23a820b94" Mar 21 05:04:51 crc kubenswrapper[4580]: I0321 05:04:51.778764 4580 scope.go:117] "RemoveContainer" containerID="92a4c095e7a4ade2273e23458e8688196a1c5116405e14261dc8ec3c9af2d3a2" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.494456 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-plhxw"] Mar 21 05:04:58 crc kubenswrapper[4580]: E0321 
05:04:58.495666 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c7def8-ae32-4d0f-93db-749023ea9f17" containerName="oc" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.495687 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c7def8-ae32-4d0f-93db-749023ea9f17" containerName="oc" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.495835 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7c7def8-ae32-4d0f-93db-749023ea9f17" containerName="oc" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.496393 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-plhxw" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.501461 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.501535 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.501731 4580 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-mmxlw" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.503750 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-ssv6b"] Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.504813 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ssv6b" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.506259 4580 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-nqwzt" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.519556 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ssv6b"] Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.583644 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-plhxw"] Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.586218 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-kcmc2"] Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.588002 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-kcmc2" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.591838 4580 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-wr7hp" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.611334 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-kcmc2"] Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.669639 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4w28\" (UniqueName: \"kubernetes.io/projected/cc874172-d2d5-4811-8b52-8822da8cb97f-kube-api-access-q4w28\") pod \"cert-manager-858654f9db-ssv6b\" (UID: \"cc874172-d2d5-4811-8b52-8822da8cb97f\") " pod="cert-manager/cert-manager-858654f9db-ssv6b" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.669717 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzsv9\" (UniqueName: 
\"kubernetes.io/projected/cdd054c6-1468-4d34-866c-612b69c7bb4f-kube-api-access-dzsv9\") pod \"cert-manager-cainjector-cf98fcc89-plhxw\" (UID: \"cdd054c6-1468-4d34-866c-612b69c7bb4f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-plhxw" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.771988 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcqwv\" (UniqueName: \"kubernetes.io/projected/7154e53a-0974-4463-9d9a-20cea09f0e94-kube-api-access-mcqwv\") pod \"cert-manager-webhook-687f57d79b-kcmc2\" (UID: \"7154e53a-0974-4463-9d9a-20cea09f0e94\") " pod="cert-manager/cert-manager-webhook-687f57d79b-kcmc2" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.772195 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4w28\" (UniqueName: \"kubernetes.io/projected/cc874172-d2d5-4811-8b52-8822da8cb97f-kube-api-access-q4w28\") pod \"cert-manager-858654f9db-ssv6b\" (UID: \"cc874172-d2d5-4811-8b52-8822da8cb97f\") " pod="cert-manager/cert-manager-858654f9db-ssv6b" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.772259 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzsv9\" (UniqueName: \"kubernetes.io/projected/cdd054c6-1468-4d34-866c-612b69c7bb4f-kube-api-access-dzsv9\") pod \"cert-manager-cainjector-cf98fcc89-plhxw\" (UID: \"cdd054c6-1468-4d34-866c-612b69c7bb4f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-plhxw" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.797772 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4w28\" (UniqueName: \"kubernetes.io/projected/cc874172-d2d5-4811-8b52-8822da8cb97f-kube-api-access-q4w28\") pod \"cert-manager-858654f9db-ssv6b\" (UID: \"cc874172-d2d5-4811-8b52-8822da8cb97f\") " pod="cert-manager/cert-manager-858654f9db-ssv6b" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.799866 4580 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzsv9\" (UniqueName: \"kubernetes.io/projected/cdd054c6-1468-4d34-866c-612b69c7bb4f-kube-api-access-dzsv9\") pod \"cert-manager-cainjector-cf98fcc89-plhxw\" (UID: \"cdd054c6-1468-4d34-866c-612b69c7bb4f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-plhxw" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.816948 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-plhxw" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.828461 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ssv6b" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.872955 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcqwv\" (UniqueName: \"kubernetes.io/projected/7154e53a-0974-4463-9d9a-20cea09f0e94-kube-api-access-mcqwv\") pod \"cert-manager-webhook-687f57d79b-kcmc2\" (UID: \"7154e53a-0974-4463-9d9a-20cea09f0e94\") " pod="cert-manager/cert-manager-webhook-687f57d79b-kcmc2" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.894427 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcqwv\" (UniqueName: \"kubernetes.io/projected/7154e53a-0974-4463-9d9a-20cea09f0e94-kube-api-access-mcqwv\") pod \"cert-manager-webhook-687f57d79b-kcmc2\" (UID: \"7154e53a-0974-4463-9d9a-20cea09f0e94\") " pod="cert-manager/cert-manager-webhook-687f57d79b-kcmc2" Mar 21 05:04:58 crc kubenswrapper[4580]: I0321 05:04:58.917728 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-kcmc2" Mar 21 05:04:59 crc kubenswrapper[4580]: I0321 05:04:59.070773 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ssv6b"] Mar 21 05:04:59 crc kubenswrapper[4580]: I0321 05:04:59.112984 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-plhxw"] Mar 21 05:04:59 crc kubenswrapper[4580]: W0321 05:04:59.116421 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdd054c6_1468_4d34_866c_612b69c7bb4f.slice/crio-da952a3ea2dac1300aa408f586ed86843cf0acfe071087d3686bbeaff7274227 WatchSource:0}: Error finding container da952a3ea2dac1300aa408f586ed86843cf0acfe071087d3686bbeaff7274227: Status 404 returned error can't find the container with id da952a3ea2dac1300aa408f586ed86843cf0acfe071087d3686bbeaff7274227 Mar 21 05:04:59 crc kubenswrapper[4580]: I0321 05:04:59.194337 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-kcmc2"] Mar 21 05:04:59 crc kubenswrapper[4580]: W0321 05:04:59.200775 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7154e53a_0974_4463_9d9a_20cea09f0e94.slice/crio-54533cead56804144ee1269ba24f02223b10b50a5c277f2b74fb46033cc510e5 WatchSource:0}: Error finding container 54533cead56804144ee1269ba24f02223b10b50a5c277f2b74fb46033cc510e5: Status 404 returned error can't find the container with id 54533cead56804144ee1269ba24f02223b10b50a5c277f2b74fb46033cc510e5 Mar 21 05:04:59 crc kubenswrapper[4580]: I0321 05:04:59.992332 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-kcmc2" 
event={"ID":"7154e53a-0974-4463-9d9a-20cea09f0e94","Type":"ContainerStarted","Data":"54533cead56804144ee1269ba24f02223b10b50a5c277f2b74fb46033cc510e5"} Mar 21 05:04:59 crc kubenswrapper[4580]: I0321 05:04:59.993212 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ssv6b" event={"ID":"cc874172-d2d5-4811-8b52-8822da8cb97f","Type":"ContainerStarted","Data":"5af430ce7bac72d3661f6e12de8dbe9625ccccb965aa5eaf54fe027a28c9ff3a"} Mar 21 05:04:59 crc kubenswrapper[4580]: I0321 05:04:59.995583 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-plhxw" event={"ID":"cdd054c6-1468-4d34-866c-612b69c7bb4f","Type":"ContainerStarted","Data":"da952a3ea2dac1300aa408f586ed86843cf0acfe071087d3686bbeaff7274227"} Mar 21 05:05:03 crc kubenswrapper[4580]: I0321 05:05:03.014090 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-plhxw" event={"ID":"cdd054c6-1468-4d34-866c-612b69c7bb4f","Type":"ContainerStarted","Data":"8f3b606499204632cd0e8965e1a6474d0a696907eb45402ef081738984939f83"} Mar 21 05:05:03 crc kubenswrapper[4580]: I0321 05:05:03.018013 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-kcmc2" event={"ID":"7154e53a-0974-4463-9d9a-20cea09f0e94","Type":"ContainerStarted","Data":"2597df2f300637876deca02ec9626c84bd9fcdea725254fe64dae8e563bf5076"} Mar 21 05:05:03 crc kubenswrapper[4580]: I0321 05:05:03.018191 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-kcmc2" Mar 21 05:05:03 crc kubenswrapper[4580]: I0321 05:05:03.023375 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ssv6b" event={"ID":"cc874172-d2d5-4811-8b52-8822da8cb97f","Type":"ContainerStarted","Data":"db6433d1c890fea26126536709aaa1391d819225db5a243b33e506c16edbd07d"} Mar 21 05:05:03 crc 
kubenswrapper[4580]: I0321 05:05:03.054573 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-plhxw" podStartSLOduration=1.56961473 podStartE2EDuration="5.054549982s" podCreationTimestamp="2026-03-21 05:04:58 +0000 UTC" firstStartedPulling="2026-03-21 05:04:59.120132243 +0000 UTC m=+804.202715871" lastFinishedPulling="2026-03-21 05:05:02.605067495 +0000 UTC m=+807.687651123" observedRunningTime="2026-03-21 05:05:03.046498136 +0000 UTC m=+808.129081794" watchObservedRunningTime="2026-03-21 05:05:03.054549982 +0000 UTC m=+808.137133610" Mar 21 05:05:03 crc kubenswrapper[4580]: I0321 05:05:03.098991 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-ssv6b" podStartSLOduration=1.57698785 podStartE2EDuration="5.098966116s" podCreationTimestamp="2026-03-21 05:04:58 +0000 UTC" firstStartedPulling="2026-03-21 05:04:59.084078115 +0000 UTC m=+804.166661743" lastFinishedPulling="2026-03-21 05:05:02.606056381 +0000 UTC m=+807.688640009" observedRunningTime="2026-03-21 05:05:03.083055188 +0000 UTC m=+808.165638816" watchObservedRunningTime="2026-03-21 05:05:03.098966116 +0000 UTC m=+808.181549744" Mar 21 05:05:03 crc kubenswrapper[4580]: I0321 05:05:03.149018 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-kcmc2" podStartSLOduration=1.6724081640000001 podStartE2EDuration="5.1489936s" podCreationTimestamp="2026-03-21 05:04:58 +0000 UTC" firstStartedPulling="2026-03-21 05:04:59.203090702 +0000 UTC m=+804.285674321" lastFinishedPulling="2026-03-21 05:05:02.679676129 +0000 UTC m=+807.762259757" observedRunningTime="2026-03-21 05:05:03.145560768 +0000 UTC m=+808.228144416" watchObservedRunningTime="2026-03-21 05:05:03.1489936 +0000 UTC m=+808.231577228" Mar 21 05:05:08 crc kubenswrapper[4580]: I0321 05:05:08.920936 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-kcmc2" Mar 21 05:05:19 crc kubenswrapper[4580]: I0321 05:05:19.138624 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2pzl9"] Mar 21 05:05:19 crc kubenswrapper[4580]: I0321 05:05:19.139890 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovn-controller" containerID="cri-o://add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc" gracePeriod=30 Mar 21 05:05:19 crc kubenswrapper[4580]: I0321 05:05:19.140019 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="sbdb" containerID="cri-o://df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd" gracePeriod=30 Mar 21 05:05:19 crc kubenswrapper[4580]: I0321 05:05:19.140007 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="nbdb" containerID="cri-o://20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b" gracePeriod=30 Mar 21 05:05:19 crc kubenswrapper[4580]: I0321 05:05:19.140007 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41" gracePeriod=30 Mar 21 05:05:19 crc kubenswrapper[4580]: I0321 05:05:19.140167 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="northd" 
containerID="cri-o://a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e" gracePeriod=30 Mar 21 05:05:19 crc kubenswrapper[4580]: I0321 05:05:19.140252 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovn-acl-logging" containerID="cri-o://52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee" gracePeriod=30 Mar 21 05:05:19 crc kubenswrapper[4580]: I0321 05:05:19.142405 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="kube-rbac-proxy-node" containerID="cri-o://9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e" gracePeriod=30 Mar 21 05:05:19 crc kubenswrapper[4580]: I0321 05:05:19.181129 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" containerID="cri-o://f3a7b7fc8d7b086453bdbebc04f338be16fb353fd37bc5d175c1797db5d57c46" gracePeriod=30 Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.124633 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovnkube-controller/3.log" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.128076 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovn-acl-logging/0.log" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.128753 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovn-controller/0.log" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.129306 4580 generic.go:334] 
"Generic (PLEG): container finished" podID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerID="f3a7b7fc8d7b086453bdbebc04f338be16fb353fd37bc5d175c1797db5d57c46" exitCode=0 Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.129339 4580 generic.go:334] "Generic (PLEG): container finished" podID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerID="df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd" exitCode=0 Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.129350 4580 generic.go:334] "Generic (PLEG): container finished" podID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerID="20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b" exitCode=0 Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.129361 4580 generic.go:334] "Generic (PLEG): container finished" podID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerID="a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e" exitCode=0 Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.129372 4580 generic.go:334] "Generic (PLEG): container finished" podID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerID="b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41" exitCode=0 Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.129381 4580 generic.go:334] "Generic (PLEG): container finished" podID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerID="9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e" exitCode=0 Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.129388 4580 generic.go:334] "Generic (PLEG): container finished" podID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerID="52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee" exitCode=143 Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.129447 4580 generic.go:334] "Generic (PLEG): container finished" podID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerID="add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc" exitCode=143 Mar 21 05:05:20 crc 
kubenswrapper[4580]: I0321 05:05:20.129436 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"f3a7b7fc8d7b086453bdbebc04f338be16fb353fd37bc5d175c1797db5d57c46"} Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.129596 4580 scope.go:117] "RemoveContainer" containerID="b614d8abd9f22665bc7933716a153495c9da89b7a1806a41874f27e189d5505d" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.130345 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd"} Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.130391 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b"} Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.130412 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e"} Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.130431 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41"} Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.130449 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" 
event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e"} Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.130467 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee"} Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.130484 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc"} Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.132320 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z5bcs_f6761e28-8a0c-4ea2-b248-2bd60e3862e6/kube-multus/2.log" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.132991 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z5bcs_f6761e28-8a0c-4ea2-b248-2bd60e3862e6/kube-multus/1.log" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.133039 4580 generic.go:334] "Generic (PLEG): container finished" podID="f6761e28-8a0c-4ea2-b248-2bd60e3862e6" containerID="b9bd2bb0ffd184225a6d57bedbcd4c082b6bce0cbbac8c80394eab05be82361a" exitCode=2 Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.133074 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5bcs" event={"ID":"f6761e28-8a0c-4ea2-b248-2bd60e3862e6","Type":"ContainerDied","Data":"b9bd2bb0ffd184225a6d57bedbcd4c082b6bce0cbbac8c80394eab05be82361a"} Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.133718 4580 scope.go:117] "RemoveContainer" containerID="b9bd2bb0ffd184225a6d57bedbcd4c082b6bce0cbbac8c80394eab05be82361a" Mar 21 05:05:20 crc kubenswrapper[4580]: E0321 
05:05:20.133961 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-z5bcs_openshift-multus(f6761e28-8a0c-4ea2-b248-2bd60e3862e6)\"" pod="openshift-multus/multus-z5bcs" podUID="f6761e28-8a0c-4ea2-b248-2bd60e3862e6" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.266934 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovn-acl-logging/0.log" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.271123 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovn-controller/0.log" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.272309 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.308303 4580 scope.go:117] "RemoveContainer" containerID="54c6c6747eb760f0735d1d9a95c1a7a436737adc6bcf6c2f4ecae6e770b8f6b8" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.345834 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wqmz7"] Mar 21 05:05:20 crc kubenswrapper[4580]: E0321 05:05:20.346370 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.346474 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: E0321 05:05:20.346541 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="kube-rbac-proxy-ovn-metrics" Mar 21 05:05:20 
crc kubenswrapper[4580]: I0321 05:05:20.346592 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="kube-rbac-proxy-ovn-metrics" Mar 21 05:05:20 crc kubenswrapper[4580]: E0321 05:05:20.346669 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="northd" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.346722 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="northd" Mar 21 05:05:20 crc kubenswrapper[4580]: E0321 05:05:20.346797 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="kubecfg-setup" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.346852 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="kubecfg-setup" Mar 21 05:05:20 crc kubenswrapper[4580]: E0321 05:05:20.347030 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.347085 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: E0321 05:05:20.347148 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.347199 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: E0321 05:05:20.347251 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="kube-rbac-proxy-node" Mar 21 05:05:20 crc 
kubenswrapper[4580]: I0321 05:05:20.347302 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="kube-rbac-proxy-node" Mar 21 05:05:20 crc kubenswrapper[4580]: E0321 05:05:20.347357 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovn-acl-logging" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.347408 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovn-acl-logging" Mar 21 05:05:20 crc kubenswrapper[4580]: E0321 05:05:20.347478 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovn-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.347535 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovn-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: E0321 05:05:20.347589 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="nbdb" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.347641 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="nbdb" Mar 21 05:05:20 crc kubenswrapper[4580]: E0321 05:05:20.347701 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="sbdb" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.347756 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="sbdb" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.347931 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.347990 4580 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.348043 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="nbdb" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.348093 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.348142 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="kube-rbac-proxy-ovn-metrics" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.348210 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.348266 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="sbdb" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.348317 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovn-acl-logging" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.348364 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="kube-rbac-proxy-node" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.348417 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="northd" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.348465 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovn-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: 
E0321 05:05:20.348651 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.348705 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: E0321 05:05:20.348761 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.348834 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.348986 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" containerName="ovnkube-controller" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.351014 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.372908 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovnkube-script-lib\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373176 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-ovn\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373262 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovnkube-config\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373335 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-run-ovn-kubernetes\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373264 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373459 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-var-lib-openvswitch\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373544 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373447 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373606 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-kubelet\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373635 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373494 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373673 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-slash\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373702 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-cni-netd\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373733 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-node-log\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373765 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-etc-openvswitch\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373764 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373810 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-node-log" (OuterVolumeSpecName: "node-log") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373872 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-env-overrides\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373854 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-slash" (OuterVolumeSpecName: "host-slash") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373896 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373902 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373903 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-log-socket\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373928 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-log-socket" (OuterVolumeSpecName: "log-socket") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373960 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-systemd\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.373987 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-systemd-units\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374013 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovn-node-metrics-cert\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374037 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-run-netns\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374057 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shb8k\" (UniqueName: \"kubernetes.io/projected/2b33648e-09ea-47e5-a32d-8bc5f0209e92-kube-api-access-shb8k\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374190 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-cni-bin\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374198 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374222 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-openvswitch\") pod \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\" (UID: \"2b33648e-09ea-47e5-a32d-8bc5f0209e92\") " Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374346 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374424 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374446 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374467 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374570 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374754 4580 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374849 4580 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374897 4580 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374919 4580 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374934 4580 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374975 4580 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-slash\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.374990 4580 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 21 
05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.375005 4580 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-node-log\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.375007 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.375019 4580 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.375064 4580 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.375080 4580 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-log-socket\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.375094 4580 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.402021 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b33648e-09ea-47e5-a32d-8bc5f0209e92-kube-api-access-shb8k" 
(OuterVolumeSpecName: "kube-api-access-shb8k") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "kube-api-access-shb8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.402255 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.403052 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2b33648e-09ea-47e5-a32d-8bc5f0209e92" (UID: "2b33648e-09ea-47e5-a32d-8bc5f0209e92"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.475940 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-var-lib-openvswitch\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476003 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476029 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-run-ovn-kubernetes\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476053 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-run-netns\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476078 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-cni-netd\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476256 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-kubelet\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476314 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-systemd-units\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476347 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-ovn-node-metrics-cert\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476367 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lz9z\" (UniqueName: \"kubernetes.io/projected/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-kube-api-access-9lz9z\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476429 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-run-openvswitch\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476492 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-ovnkube-script-lib\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476536 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-etc-openvswitch\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476557 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-env-overrides\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476610 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-run-ovn\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476633 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-cni-bin\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476654 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-node-log\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476699 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-log-socket\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476719 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-run-systemd\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476742 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-ovnkube-config\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476795 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-slash\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476870 4580 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476882 4580 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476897 4580 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476907 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shb8k\" (UniqueName: \"kubernetes.io/projected/2b33648e-09ea-47e5-a32d-8bc5f0209e92-kube-api-access-shb8k\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476918 4580 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476929 4580 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476938 4580 reconciler_common.go:293] "Volume detached for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2b33648e-09ea-47e5-a32d-8bc5f0209e92-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.476949 4580 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b33648e-09ea-47e5-a32d-8bc5f0209e92-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.578230 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-log-socket\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.578282 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-run-systemd\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.578308 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-ovnkube-config\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.578335 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-slash\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.578422 4580 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-log-socket\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.578463 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-run-systemd\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.578532 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-slash\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.578645 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-var-lib-openvswitch\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579181 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-ovnkube-config\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579260 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-var-lib-openvswitch\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579361 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579391 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-run-netns\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579415 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-run-ovn-kubernetes\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579439 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-cni-netd\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579474 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-kubelet\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579499 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-systemd-units\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579531 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-ovn-node-metrics-cert\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579553 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lz9z\" (UniqueName: \"kubernetes.io/projected/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-kube-api-access-9lz9z\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579601 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-run-openvswitch\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579642 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-ovnkube-script-lib\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579687 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-etc-openvswitch\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579707 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-env-overrides\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579745 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-run-ovn\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579769 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-cni-bin\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579811 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-node-log\") pod 
\"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579902 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-node-log\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579958 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.579992 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-run-netns\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.580023 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-run-ovn-kubernetes\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.580053 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-cni-netd\") pod \"ovnkube-node-wqmz7\" (UID: 
\"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.580087 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-kubelet\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.580120 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-systemd-units\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.580542 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-etc-openvswitch\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.580656 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-run-ovn\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.580712 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-host-cni-bin\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 
05:05:20.580753 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-run-openvswitch\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.581262 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-env-overrides\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.581473 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-ovnkube-script-lib\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.583337 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-ovn-node-metrics-cert\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.605303 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lz9z\" (UniqueName: \"kubernetes.io/projected/65bfbd2d-8acf-4fe3-a61c-dedaf437d269-kube-api-access-9lz9z\") pod \"ovnkube-node-wqmz7\" (UID: \"65bfbd2d-8acf-4fe3-a61c-dedaf437d269\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: I0321 05:05:20.667293 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:20 crc kubenswrapper[4580]: W0321 05:05:20.686374 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65bfbd2d_8acf_4fe3_a61c_dedaf437d269.slice/crio-c791f990f489d714362a247f7ef71485faf1b36d03bc3866d31aac18f18ce061 WatchSource:0}: Error finding container c791f990f489d714362a247f7ef71485faf1b36d03bc3866d31aac18f18ce061: Status 404 returned error can't find the container with id c791f990f489d714362a247f7ef71485faf1b36d03bc3866d31aac18f18ce061 Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.140271 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z5bcs_f6761e28-8a0c-4ea2-b248-2bd60e3862e6/kube-multus/2.log" Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.142230 4580 generic.go:334] "Generic (PLEG): container finished" podID="65bfbd2d-8acf-4fe3-a61c-dedaf437d269" containerID="2a3cbd0f777dc5425e747d73f2952ad9b78aaae4c1b22496f8ce6a32a8dfdee6" exitCode=0 Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.142305 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" event={"ID":"65bfbd2d-8acf-4fe3-a61c-dedaf437d269","Type":"ContainerDied","Data":"2a3cbd0f777dc5425e747d73f2952ad9b78aaae4c1b22496f8ce6a32a8dfdee6"} Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.142350 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" event={"ID":"65bfbd2d-8acf-4fe3-a61c-dedaf437d269","Type":"ContainerStarted","Data":"c791f990f489d714362a247f7ef71485faf1b36d03bc3866d31aac18f18ce061"} Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.149377 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovn-acl-logging/0.log" Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 
05:05:21.150196 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pzl9_2b33648e-09ea-47e5-a32d-8bc5f0209e92/ovn-controller/0.log" Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.150610 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" event={"ID":"2b33648e-09ea-47e5-a32d-8bc5f0209e92","Type":"ContainerDied","Data":"82a82ffa831bf919d6f70d37fa1437ec01369365dc51cd7c6e8f31df8b28fcdf"} Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.150663 4580 scope.go:117] "RemoveContainer" containerID="f3a7b7fc8d7b086453bdbebc04f338be16fb353fd37bc5d175c1797db5d57c46" Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.150912 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2pzl9" Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.177389 4580 scope.go:117] "RemoveContainer" containerID="df0abb17eb7b99ae1e1d82e4fffdc255319a2cc8d3690601f88252b86fe136dd" Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.191483 4580 scope.go:117] "RemoveContainer" containerID="20c41d7dc680c08c346c970d840c2983eef7194139fd844a6ceecdd412ae939b" Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.228267 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2pzl9"] Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.229472 4580 scope.go:117] "RemoveContainer" containerID="a81d58163605aa09dd8816b5980c91a2f17bae1071e61985230187f1791a1c1e" Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.237379 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2pzl9"] Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.242225 4580 scope.go:117] "RemoveContainer" containerID="b3f8a054c427699e2614981361fe8c76af6867967db3e19af4818df5b848ac41" Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.255972 4580 
scope.go:117] "RemoveContainer" containerID="9bb9d142884cb9fb3aa7ed74056f6e282150e6dbddd96e6d9a5ea70f7573e42e" Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.318537 4580 scope.go:117] "RemoveContainer" containerID="52be3a7ca4bd116d3562338d75f7bd9226760f3f4569d06e2fa42dab4dccceee" Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.336004 4580 scope.go:117] "RemoveContainer" containerID="add44cd2632a2999385f19445d1e1081c65ebe66c3df74b675e6c86c6921becc" Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.357235 4580 scope.go:117] "RemoveContainer" containerID="2338cba87dd5898b9dadd05acb49a9f239b9330bd6ad71fd54ae38b81e830623" Mar 21 05:05:21 crc kubenswrapper[4580]: I0321 05:05:21.628058 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b33648e-09ea-47e5-a32d-8bc5f0209e92" path="/var/lib/kubelet/pods/2b33648e-09ea-47e5-a32d-8bc5f0209e92/volumes" Mar 21 05:05:22 crc kubenswrapper[4580]: I0321 05:05:22.160238 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" event={"ID":"65bfbd2d-8acf-4fe3-a61c-dedaf437d269","Type":"ContainerStarted","Data":"baf6d2b722e4dd16336838c0703a8e9855505440db344ab0eee351217ad83ad7"} Mar 21 05:05:22 crc kubenswrapper[4580]: I0321 05:05:22.160617 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" event={"ID":"65bfbd2d-8acf-4fe3-a61c-dedaf437d269","Type":"ContainerStarted","Data":"b858d27315e961b4bce28a46e47284feff000ebacc9c2c63fd32c27a311b4bf4"} Mar 21 05:05:22 crc kubenswrapper[4580]: I0321 05:05:22.160632 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" event={"ID":"65bfbd2d-8acf-4fe3-a61c-dedaf437d269","Type":"ContainerStarted","Data":"f7f6cd9149a0d0ada657a6a74718eee01a1f54fa2b1679671171b9d94db9c266"} Mar 21 05:05:23 crc kubenswrapper[4580]: I0321 05:05:23.168112 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" event={"ID":"65bfbd2d-8acf-4fe3-a61c-dedaf437d269","Type":"ContainerStarted","Data":"c4dd67205c50e1854e98d0550628dd8e40053af7c842c7e442e6f3f0bde0fea7"} Mar 21 05:05:23 crc kubenswrapper[4580]: I0321 05:05:23.168191 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" event={"ID":"65bfbd2d-8acf-4fe3-a61c-dedaf437d269","Type":"ContainerStarted","Data":"374ee8f8708bbcce3ce5d33769b540e03f44f5cb72e17e826595a88643abe735"} Mar 21 05:05:23 crc kubenswrapper[4580]: I0321 05:05:23.168203 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" event={"ID":"65bfbd2d-8acf-4fe3-a61c-dedaf437d269","Type":"ContainerStarted","Data":"f567cea23a7bf5d8db8d8609f9c992f0bf21799aa9e2775253da8d4497375133"} Mar 21 05:05:25 crc kubenswrapper[4580]: I0321 05:05:25.188259 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" event={"ID":"65bfbd2d-8acf-4fe3-a61c-dedaf437d269","Type":"ContainerStarted","Data":"bdd7e538bb5eebac422eb930c77399af743d005276a946f74b63a5632d2df92f"} Mar 21 05:05:27 crc kubenswrapper[4580]: I0321 05:05:27.203125 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" event={"ID":"65bfbd2d-8acf-4fe3-a61c-dedaf437d269","Type":"ContainerStarted","Data":"4a2e338d69568dd8be7c0a70ecc1caa12eb1c6a5a29967d3ea55a2741e326be3"} Mar 21 05:05:27 crc kubenswrapper[4580]: I0321 05:05:27.203624 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:27 crc kubenswrapper[4580]: I0321 05:05:27.203643 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:27 crc kubenswrapper[4580]: I0321 05:05:27.203653 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:27 crc kubenswrapper[4580]: I0321 05:05:27.255814 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:27 crc kubenswrapper[4580]: I0321 05:05:27.256338 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" podStartSLOduration=7.256324297 podStartE2EDuration="7.256324297s" podCreationTimestamp="2026-03-21 05:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:05:27.249808472 +0000 UTC m=+832.332392100" watchObservedRunningTime="2026-03-21 05:05:27.256324297 +0000 UTC m=+832.338907925" Mar 21 05:05:27 crc kubenswrapper[4580]: I0321 05:05:27.259858 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:34 crc kubenswrapper[4580]: I0321 05:05:34.618453 4580 scope.go:117] "RemoveContainer" containerID="b9bd2bb0ffd184225a6d57bedbcd4c082b6bce0cbbac8c80394eab05be82361a" Mar 21 05:05:35 crc kubenswrapper[4580]: I0321 05:05:35.257266 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z5bcs_f6761e28-8a0c-4ea2-b248-2bd60e3862e6/kube-multus/2.log" Mar 21 05:05:35 crc kubenswrapper[4580]: I0321 05:05:35.257684 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5bcs" event={"ID":"f6761e28-8a0c-4ea2-b248-2bd60e3862e6","Type":"ContainerStarted","Data":"92d2d5d81a7e4fe4d31ea5bf6bad115c3e58ad17ca1de8b90194e2c82d2c710c"} Mar 21 05:05:50 crc kubenswrapper[4580]: I0321 05:05:50.711506 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wqmz7" Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.123072 4580 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb"] Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.124476 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.129319 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.138640 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb"] Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.238357 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb\" (UID: \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.238435 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6jzn\" (UniqueName: \"kubernetes.io/projected/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-kube-api-access-h6jzn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb\" (UID: \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.238466 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-bundle\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb\" (UID: \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.340178 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb\" (UID: \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.340585 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6jzn\" (UniqueName: \"kubernetes.io/projected/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-kube-api-access-h6jzn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb\" (UID: \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.340707 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb\" (UID: \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.340887 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb\" (UID: \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\") " 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.341309 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb\" (UID: \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.366261 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6jzn\" (UniqueName: \"kubernetes.io/projected/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-kube-api-access-h6jzn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb\" (UID: \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.444949 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" Mar 21 05:05:51 crc kubenswrapper[4580]: I0321 05:05:51.674107 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb"] Mar 21 05:05:52 crc kubenswrapper[4580]: I0321 05:05:52.375195 4580 generic.go:334] "Generic (PLEG): container finished" podID="20e4a1fa-c6ce-4a58-a9b9-a982a19c5243" containerID="1a4454703b5f0ef2b3094f9d0c4b33084a00f09cf9747147a387e391079f50f4" exitCode=0 Mar 21 05:05:52 crc kubenswrapper[4580]: I0321 05:05:52.375262 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" event={"ID":"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243","Type":"ContainerDied","Data":"1a4454703b5f0ef2b3094f9d0c4b33084a00f09cf9747147a387e391079f50f4"} Mar 21 05:05:52 crc kubenswrapper[4580]: I0321 05:05:52.375665 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" event={"ID":"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243","Type":"ContainerStarted","Data":"ee62984d12ee536cbae95ac83cf33ceb7933ffddf5d6d40033bb40b14df17334"} Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.394044 4580 generic.go:334] "Generic (PLEG): container finished" podID="20e4a1fa-c6ce-4a58-a9b9-a982a19c5243" containerID="959ad17439085f48bb4167b4a308c7cb18c87339d307e89a89d408d02f0feeb7" exitCode=0 Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.394967 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" event={"ID":"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243","Type":"ContainerDied","Data":"959ad17439085f48bb4167b4a308c7cb18c87339d307e89a89d408d02f0feeb7"} Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.651948 4580 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vwmlz"] Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.656153 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.674800 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwmlz"] Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.794979 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-catalog-content\") pod \"redhat-operators-vwmlz\" (UID: \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\") " pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.795095 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-utilities\") pod \"redhat-operators-vwmlz\" (UID: \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\") " pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.795143 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l7cp\" (UniqueName: \"kubernetes.io/projected/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-kube-api-access-9l7cp\") pod \"redhat-operators-vwmlz\" (UID: \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\") " pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.896630 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-catalog-content\") pod \"redhat-operators-vwmlz\" (UID: 
\"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\") " pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.896711 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-utilities\") pod \"redhat-operators-vwmlz\" (UID: \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\") " pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.896737 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l7cp\" (UniqueName: \"kubernetes.io/projected/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-kube-api-access-9l7cp\") pod \"redhat-operators-vwmlz\" (UID: \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\") " pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.897438 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-catalog-content\") pod \"redhat-operators-vwmlz\" (UID: \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\") " pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.897526 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-utilities\") pod \"redhat-operators-vwmlz\" (UID: \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\") " pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:05:54 crc kubenswrapper[4580]: I0321 05:05:54.929110 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l7cp\" (UniqueName: \"kubernetes.io/projected/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-kube-api-access-9l7cp\") pod \"redhat-operators-vwmlz\" (UID: \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\") " 
pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:05:55 crc kubenswrapper[4580]: I0321 05:05:55.006893 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:05:55 crc kubenswrapper[4580]: I0321 05:05:55.405171 4580 generic.go:334] "Generic (PLEG): container finished" podID="20e4a1fa-c6ce-4a58-a9b9-a982a19c5243" containerID="b253d6092a0197bb2bc706fe38df269bec0d9bb54a17376e4cd3a4cbaa194dd1" exitCode=0 Mar 21 05:05:55 crc kubenswrapper[4580]: I0321 05:05:55.405285 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" event={"ID":"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243","Type":"ContainerDied","Data":"b253d6092a0197bb2bc706fe38df269bec0d9bb54a17376e4cd3a4cbaa194dd1"} Mar 21 05:05:55 crc kubenswrapper[4580]: I0321 05:05:55.457255 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwmlz"] Mar 21 05:05:56 crc kubenswrapper[4580]: I0321 05:05:56.412640 4580 generic.go:334] "Generic (PLEG): container finished" podID="3a26cc0d-3ca6-4fa4-98dc-554493370fa6" containerID="a6a230cfb7f1ecb7f1f72884b0fcd43aae99b2a5b8769d80f7163b7b3005639a" exitCode=0 Mar 21 05:05:56 crc kubenswrapper[4580]: I0321 05:05:56.412934 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwmlz" event={"ID":"3a26cc0d-3ca6-4fa4-98dc-554493370fa6","Type":"ContainerDied","Data":"a6a230cfb7f1ecb7f1f72884b0fcd43aae99b2a5b8769d80f7163b7b3005639a"} Mar 21 05:05:56 crc kubenswrapper[4580]: I0321 05:05:56.413312 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwmlz" event={"ID":"3a26cc0d-3ca6-4fa4-98dc-554493370fa6","Type":"ContainerStarted","Data":"ac7726ed82c569259f45fec1fe7bbce9fcd3f777bf0b71767b525aba430c3865"} Mar 21 05:05:56 crc kubenswrapper[4580]: I0321 05:05:56.660300 4580 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" Mar 21 05:05:56 crc kubenswrapper[4580]: I0321 05:05:56.823977 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-util\") pod \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\" (UID: \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\") " Mar 21 05:05:56 crc kubenswrapper[4580]: I0321 05:05:56.824086 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-bundle\") pod \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\" (UID: \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\") " Mar 21 05:05:56 crc kubenswrapper[4580]: I0321 05:05:56.824209 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6jzn\" (UniqueName: \"kubernetes.io/projected/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-kube-api-access-h6jzn\") pod \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\" (UID: \"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243\") " Mar 21 05:05:56 crc kubenswrapper[4580]: I0321 05:05:56.825020 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-bundle" (OuterVolumeSpecName: "bundle") pod "20e4a1fa-c6ce-4a58-a9b9-a982a19c5243" (UID: "20e4a1fa-c6ce-4a58-a9b9-a982a19c5243"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:05:56 crc kubenswrapper[4580]: I0321 05:05:56.831263 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-kube-api-access-h6jzn" (OuterVolumeSpecName: "kube-api-access-h6jzn") pod "20e4a1fa-c6ce-4a58-a9b9-a982a19c5243" (UID: "20e4a1fa-c6ce-4a58-a9b9-a982a19c5243"). 
InnerVolumeSpecName "kube-api-access-h6jzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:05:56 crc kubenswrapper[4580]: I0321 05:05:56.844941 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-util" (OuterVolumeSpecName: "util") pod "20e4a1fa-c6ce-4a58-a9b9-a982a19c5243" (UID: "20e4a1fa-c6ce-4a58-a9b9-a982a19c5243"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:05:56 crc kubenswrapper[4580]: I0321 05:05:56.926413 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6jzn\" (UniqueName: \"kubernetes.io/projected/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-kube-api-access-h6jzn\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:56 crc kubenswrapper[4580]: I0321 05:05:56.926468 4580 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-util\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:56 crc kubenswrapper[4580]: I0321 05:05:56.926481 4580 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e4a1fa-c6ce-4a58-a9b9-a982a19c5243-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:57 crc kubenswrapper[4580]: I0321 05:05:57.422942 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" event={"ID":"20e4a1fa-c6ce-4a58-a9b9-a982a19c5243","Type":"ContainerDied","Data":"ee62984d12ee536cbae95ac83cf33ceb7933ffddf5d6d40033bb40b14df17334"} Mar 21 05:05:57 crc kubenswrapper[4580]: I0321 05:05:57.423039 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee62984d12ee536cbae95ac83cf33ceb7933ffddf5d6d40033bb40b14df17334" Mar 21 05:05:57 crc kubenswrapper[4580]: I0321 05:05:57.422979 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb" Mar 21 05:05:57 crc kubenswrapper[4580]: I0321 05:05:57.426864 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwmlz" event={"ID":"3a26cc0d-3ca6-4fa4-98dc-554493370fa6","Type":"ContainerStarted","Data":"e0d365545a5a41bdd8a37b83317857b41abd1c3b7526a9b685004a98ff6c5b2f"} Mar 21 05:05:58 crc kubenswrapper[4580]: I0321 05:05:58.437728 4580 generic.go:334] "Generic (PLEG): container finished" podID="3a26cc0d-3ca6-4fa4-98dc-554493370fa6" containerID="e0d365545a5a41bdd8a37b83317857b41abd1c3b7526a9b685004a98ff6c5b2f" exitCode=0 Mar 21 05:05:58 crc kubenswrapper[4580]: I0321 05:05:58.437921 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwmlz" event={"ID":"3a26cc0d-3ca6-4fa4-98dc-554493370fa6","Type":"ContainerDied","Data":"e0d365545a5a41bdd8a37b83317857b41abd1c3b7526a9b685004a98ff6c5b2f"} Mar 21 05:05:59 crc kubenswrapper[4580]: I0321 05:05:59.445733 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwmlz" event={"ID":"3a26cc0d-3ca6-4fa4-98dc-554493370fa6","Type":"ContainerStarted","Data":"1682e41c7116ca20a7f2ba051d71fd99dfe7ab84bde7de6834a127917d448015"} Mar 21 05:05:59 crc kubenswrapper[4580]: I0321 05:05:59.471092 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vwmlz" podStartSLOduration=3.054122698 podStartE2EDuration="5.471065417s" podCreationTimestamp="2026-03-21 05:05:54 +0000 UTC" firstStartedPulling="2026-03-21 05:05:56.414657036 +0000 UTC m=+861.497240664" lastFinishedPulling="2026-03-21 05:05:58.831599755 +0000 UTC m=+863.914183383" observedRunningTime="2026-03-21 05:05:59.467138583 +0000 UTC m=+864.549722241" watchObservedRunningTime="2026-03-21 05:05:59.471065417 +0000 UTC m=+864.553649035" Mar 21 05:05:59 crc 
kubenswrapper[4580]: I0321 05:05:59.768988 4580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.154387 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567826-5svml"] Mar 21 05:06:00 crc kubenswrapper[4580]: E0321 05:06:00.154844 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e4a1fa-c6ce-4a58-a9b9-a982a19c5243" containerName="pull" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.154869 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e4a1fa-c6ce-4a58-a9b9-a982a19c5243" containerName="pull" Mar 21 05:06:00 crc kubenswrapper[4580]: E0321 05:06:00.154890 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e4a1fa-c6ce-4a58-a9b9-a982a19c5243" containerName="util" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.154912 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e4a1fa-c6ce-4a58-a9b9-a982a19c5243" containerName="util" Mar 21 05:06:00 crc kubenswrapper[4580]: E0321 05:06:00.154937 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e4a1fa-c6ce-4a58-a9b9-a982a19c5243" containerName="extract" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.154945 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e4a1fa-c6ce-4a58-a9b9-a982a19c5243" containerName="extract" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.155108 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e4a1fa-c6ce-4a58-a9b9-a982a19c5243" containerName="extract" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.155735 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-5svml" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.166289 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.166579 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.166643 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-5svml"] Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.166904 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.273060 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zp9x\" (UniqueName: \"kubernetes.io/projected/53dd2f4a-1fb1-43a5-b164-8de12304064a-kube-api-access-2zp9x\") pod \"auto-csr-approver-29567826-5svml\" (UID: \"53dd2f4a-1fb1-43a5-b164-8de12304064a\") " pod="openshift-infra/auto-csr-approver-29567826-5svml" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.374971 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zp9x\" (UniqueName: \"kubernetes.io/projected/53dd2f4a-1fb1-43a5-b164-8de12304064a-kube-api-access-2zp9x\") pod \"auto-csr-approver-29567826-5svml\" (UID: \"53dd2f4a-1fb1-43a5-b164-8de12304064a\") " pod="openshift-infra/auto-csr-approver-29567826-5svml" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.396332 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zp9x\" (UniqueName: \"kubernetes.io/projected/53dd2f4a-1fb1-43a5-b164-8de12304064a-kube-api-access-2zp9x\") pod \"auto-csr-approver-29567826-5svml\" (UID: \"53dd2f4a-1fb1-43a5-b164-8de12304064a\") " 
pod="openshift-infra/auto-csr-approver-29567826-5svml" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.477082 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-5svml" Mar 21 05:06:00 crc kubenswrapper[4580]: I0321 05:06:00.918008 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-5svml"] Mar 21 05:06:00 crc kubenswrapper[4580]: W0321 05:06:00.918472 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53dd2f4a_1fb1_43a5_b164_8de12304064a.slice/crio-42e8f8f5cad5d8addec6221c60fc9e1553829dadadd4814456659dce25d091b8 WatchSource:0}: Error finding container 42e8f8f5cad5d8addec6221c60fc9e1553829dadadd4814456659dce25d091b8: Status 404 returned error can't find the container with id 42e8f8f5cad5d8addec6221c60fc9e1553829dadadd4814456659dce25d091b8 Mar 21 05:06:01 crc kubenswrapper[4580]: I0321 05:06:01.456772 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-5svml" event={"ID":"53dd2f4a-1fb1-43a5-b164-8de12304064a","Type":"ContainerStarted","Data":"42e8f8f5cad5d8addec6221c60fc9e1553829dadadd4814456659dce25d091b8"} Mar 21 05:06:01 crc kubenswrapper[4580]: I0321 05:06:01.531979 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-2fctj"] Mar 21 05:06:01 crc kubenswrapper[4580]: I0321 05:06:01.532964 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2fctj" Mar 21 05:06:01 crc kubenswrapper[4580]: I0321 05:06:01.537895 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 21 05:06:01 crc kubenswrapper[4580]: I0321 05:06:01.537985 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ws775" Mar 21 05:06:01 crc kubenswrapper[4580]: I0321 05:06:01.538158 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 21 05:06:01 crc kubenswrapper[4580]: I0321 05:06:01.562045 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-2fctj"] Mar 21 05:06:01 crc kubenswrapper[4580]: I0321 05:06:01.592564 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmnt2\" (UniqueName: \"kubernetes.io/projected/b980d3d5-4f25-47c0-9679-8662b237e1b7-kube-api-access-hmnt2\") pod \"nmstate-operator-796d4cfff4-2fctj\" (UID: \"b980d3d5-4f25-47c0-9679-8662b237e1b7\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-2fctj" Mar 21 05:06:01 crc kubenswrapper[4580]: I0321 05:06:01.693513 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmnt2\" (UniqueName: \"kubernetes.io/projected/b980d3d5-4f25-47c0-9679-8662b237e1b7-kube-api-access-hmnt2\") pod \"nmstate-operator-796d4cfff4-2fctj\" (UID: \"b980d3d5-4f25-47c0-9679-8662b237e1b7\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-2fctj" Mar 21 05:06:01 crc kubenswrapper[4580]: I0321 05:06:01.723007 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmnt2\" (UniqueName: \"kubernetes.io/projected/b980d3d5-4f25-47c0-9679-8662b237e1b7-kube-api-access-hmnt2\") pod \"nmstate-operator-796d4cfff4-2fctj\" (UID: 
\"b980d3d5-4f25-47c0-9679-8662b237e1b7\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-2fctj" Mar 21 05:06:01 crc kubenswrapper[4580]: I0321 05:06:01.852680 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2fctj" Mar 21 05:06:02 crc kubenswrapper[4580]: I0321 05:06:02.184001 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-2fctj"] Mar 21 05:06:02 crc kubenswrapper[4580]: I0321 05:06:02.465932 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-5svml" event={"ID":"53dd2f4a-1fb1-43a5-b164-8de12304064a","Type":"ContainerStarted","Data":"3231a6ad006cb3b4b5b09cbcab62f71f5d2bcef3b7c14fc34a9085fc8cdeff20"} Mar 21 05:06:02 crc kubenswrapper[4580]: I0321 05:06:02.467725 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2fctj" event={"ID":"b980d3d5-4f25-47c0-9679-8662b237e1b7","Type":"ContainerStarted","Data":"c947eda006f0acf548c37d0ae63c4b4b37a71cbf10eb464bc64ef76d10a2f97d"} Mar 21 05:06:02 crc kubenswrapper[4580]: I0321 05:06:02.487926 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567826-5svml" podStartSLOduration=1.583180643 podStartE2EDuration="2.48790005s" podCreationTimestamp="2026-03-21 05:06:00 +0000 UTC" firstStartedPulling="2026-03-21 05:06:00.9209704 +0000 UTC m=+866.003554028" lastFinishedPulling="2026-03-21 05:06:01.825689807 +0000 UTC m=+866.908273435" observedRunningTime="2026-03-21 05:06:02.483227677 +0000 UTC m=+867.565811305" watchObservedRunningTime="2026-03-21 05:06:02.48790005 +0000 UTC m=+867.570483678" Mar 21 05:06:03 crc kubenswrapper[4580]: I0321 05:06:03.476799 4580 generic.go:334] "Generic (PLEG): container finished" podID="53dd2f4a-1fb1-43a5-b164-8de12304064a" 
containerID="3231a6ad006cb3b4b5b09cbcab62f71f5d2bcef3b7c14fc34a9085fc8cdeff20" exitCode=0 Mar 21 05:06:03 crc kubenswrapper[4580]: I0321 05:06:03.476896 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-5svml" event={"ID":"53dd2f4a-1fb1-43a5-b164-8de12304064a","Type":"ContainerDied","Data":"3231a6ad006cb3b4b5b09cbcab62f71f5d2bcef3b7c14fc34a9085fc8cdeff20"} Mar 21 05:06:04 crc kubenswrapper[4580]: I0321 05:06:04.903526 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-5svml" Mar 21 05:06:04 crc kubenswrapper[4580]: I0321 05:06:04.952701 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zp9x\" (UniqueName: \"kubernetes.io/projected/53dd2f4a-1fb1-43a5-b164-8de12304064a-kube-api-access-2zp9x\") pod \"53dd2f4a-1fb1-43a5-b164-8de12304064a\" (UID: \"53dd2f4a-1fb1-43a5-b164-8de12304064a\") " Mar 21 05:06:04 crc kubenswrapper[4580]: I0321 05:06:04.965084 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53dd2f4a-1fb1-43a5-b164-8de12304064a-kube-api-access-2zp9x" (OuterVolumeSpecName: "kube-api-access-2zp9x") pod "53dd2f4a-1fb1-43a5-b164-8de12304064a" (UID: "53dd2f4a-1fb1-43a5-b164-8de12304064a"). InnerVolumeSpecName "kube-api-access-2zp9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:05 crc kubenswrapper[4580]: I0321 05:06:05.007664 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:06:05 crc kubenswrapper[4580]: I0321 05:06:05.009119 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:06:05 crc kubenswrapper[4580]: I0321 05:06:05.054433 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zp9x\" (UniqueName: \"kubernetes.io/projected/53dd2f4a-1fb1-43a5-b164-8de12304064a-kube-api-access-2zp9x\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:05 crc kubenswrapper[4580]: I0321 05:06:05.493403 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-5svml" event={"ID":"53dd2f4a-1fb1-43a5-b164-8de12304064a","Type":"ContainerDied","Data":"42e8f8f5cad5d8addec6221c60fc9e1553829dadadd4814456659dce25d091b8"} Mar 21 05:06:05 crc kubenswrapper[4580]: I0321 05:06:05.493795 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42e8f8f5cad5d8addec6221c60fc9e1553829dadadd4814456659dce25d091b8" Mar 21 05:06:05 crc kubenswrapper[4580]: I0321 05:06:05.493869 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-5svml" Mar 21 05:06:05 crc kubenswrapper[4580]: I0321 05:06:05.501600 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2fctj" event={"ID":"b980d3d5-4f25-47c0-9679-8662b237e1b7","Type":"ContainerStarted","Data":"820cdeaf9aaa6a0d6d55ad45d62f086d3c87413440c1ed77b169914e7cf4132e"} Mar 21 05:06:05 crc kubenswrapper[4580]: I0321 05:06:05.529342 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2fctj" podStartSLOduration=1.7844308519999998 podStartE2EDuration="4.529312004s" podCreationTimestamp="2026-03-21 05:06:01 +0000 UTC" firstStartedPulling="2026-03-21 05:06:02.196296018 +0000 UTC m=+867.278879646" lastFinishedPulling="2026-03-21 05:06:04.94117717 +0000 UTC m=+870.023760798" observedRunningTime="2026-03-21 05:06:05.523981313 +0000 UTC m=+870.606564961" watchObservedRunningTime="2026-03-21 05:06:05.529312004 +0000 UTC m=+870.611895632" Mar 21 05:06:05 crc kubenswrapper[4580]: I0321 05:06:05.579645 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-r9xz9"] Mar 21 05:06:05 crc kubenswrapper[4580]: I0321 05:06:05.583374 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-r9xz9"] Mar 21 05:06:05 crc kubenswrapper[4580]: I0321 05:06:05.625104 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ad2cf4-f24a-42b2-a0ac-9793f82d3405" path="/var/lib/kubelet/pods/a9ad2cf4-f24a-42b2-a0ac-9793f82d3405/volumes" Mar 21 05:06:06 crc kubenswrapper[4580]: I0321 05:06:06.061703 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vwmlz" podUID="3a26cc0d-3ca6-4fa4-98dc-554493370fa6" containerName="registry-server" probeResult="failure" output=< Mar 21 05:06:06 crc kubenswrapper[4580]: timeout: failed to connect 
service ":50051" within 1s Mar 21 05:06:06 crc kubenswrapper[4580]: > Mar 21 05:06:15 crc kubenswrapper[4580]: I0321 05:06:15.064158 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:06:15 crc kubenswrapper[4580]: I0321 05:06:15.113540 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:06:15 crc kubenswrapper[4580]: I0321 05:06:15.300904 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vwmlz"] Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.566166 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-zfbrs"] Mar 21 05:06:16 crc kubenswrapper[4580]: E0321 05:06:16.566457 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53dd2f4a-1fb1-43a5-b164-8de12304064a" containerName="oc" Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.566473 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="53dd2f4a-1fb1-43a5-b164-8de12304064a" containerName="oc" Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.566602 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="53dd2f4a-1fb1-43a5-b164-8de12304064a" containerName="oc" Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.567324 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zfbrs"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.573719 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vwmlz" podUID="3a26cc0d-3ca6-4fa4-98dc-554493370fa6" containerName="registry-server" containerID="cri-o://1682e41c7116ca20a7f2ba051d71fd99dfe7ab84bde7de6834a127917d448015" gracePeriod=2
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.579834 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-k7dnr"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.595513 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf"]
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.596445 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.598528 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.610577 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/449ae922-f55a-437e-b18a-d6e2700cc02e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w4mjf\" (UID: \"449ae922-f55a-437e-b18a-d6e2700cc02e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.610662 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5z7k\" (UniqueName: \"kubernetes.io/projected/1b699e53-ece3-49d9-9f68-c3558aef7892-kube-api-access-r5z7k\") pod \"nmstate-metrics-9b8c8685d-zfbrs\" (UID: \"1b699e53-ece3-49d9-9f68-c3558aef7892\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zfbrs"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.611052 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfb9w\" (UniqueName: \"kubernetes.io/projected/449ae922-f55a-437e-b18a-d6e2700cc02e-kube-api-access-rfb9w\") pod \"nmstate-webhook-5f558f5558-w4mjf\" (UID: \"449ae922-f55a-437e-b18a-d6e2700cc02e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.616368 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-zfbrs"]
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.623179 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-ftlkm"]
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.624188 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.634765 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf"]
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.713992 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/449ae922-f55a-437e-b18a-d6e2700cc02e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w4mjf\" (UID: \"449ae922-f55a-437e-b18a-d6e2700cc02e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.714558 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5z7k\" (UniqueName: \"kubernetes.io/projected/1b699e53-ece3-49d9-9f68-c3558aef7892-kube-api-access-r5z7k\") pod \"nmstate-metrics-9b8c8685d-zfbrs\" (UID: \"1b699e53-ece3-49d9-9f68-c3558aef7892\") "
pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zfbrs"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.714770 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfb9w\" (UniqueName: \"kubernetes.io/projected/449ae922-f55a-437e-b18a-d6e2700cc02e-kube-api-access-rfb9w\") pod \"nmstate-webhook-5f558f5558-w4mjf\" (UID: \"449ae922-f55a-437e-b18a-d6e2700cc02e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf"
Mar 21 05:06:16 crc kubenswrapper[4580]: E0321 05:06:16.714220 4580 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Mar 21 05:06:16 crc kubenswrapper[4580]: E0321 05:06:16.716771 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/449ae922-f55a-437e-b18a-d6e2700cc02e-tls-key-pair podName:449ae922-f55a-437e-b18a-d6e2700cc02e nodeName:}" failed. No retries permitted until 2026-03-21 05:06:17.216717988 +0000 UTC m=+882.299301776 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/449ae922-f55a-437e-b18a-d6e2700cc02e-tls-key-pair") pod "nmstate-webhook-5f558f5558-w4mjf" (UID: "449ae922-f55a-437e-b18a-d6e2700cc02e") : secret "openshift-nmstate-webhook" not found
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.745224 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfb9w\" (UniqueName: \"kubernetes.io/projected/449ae922-f55a-437e-b18a-d6e2700cc02e-kube-api-access-rfb9w\") pod \"nmstate-webhook-5f558f5558-w4mjf\" (UID: \"449ae922-f55a-437e-b18a-d6e2700cc02e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.779971 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5z7k\" (UniqueName: \"kubernetes.io/projected/1b699e53-ece3-49d9-9f68-c3558aef7892-kube-api-access-r5z7k\") pod \"nmstate-metrics-9b8c8685d-zfbrs\" (UID: \"1b699e53-ece3-49d9-9f68-c3558aef7892\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zfbrs"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.812161 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"]
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.813305 4580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.815124 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3f4e68cb-70a1-40bd-815d-e35e0a3337a0-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-6t69s\" (UID: \"3f4e68cb-70a1-40bd-815d-e35e0a3337a0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.815182 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4e68cb-70a1-40bd-815d-e35e0a3337a0-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-6t69s\" (UID: \"3f4e68cb-70a1-40bd-815d-e35e0a3337a0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.815285 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c458e5d3-8c0a-4135-aba8-54854b16c411-dbus-socket\") pod \"nmstate-handler-ftlkm\" (UID: \"c458e5d3-8c0a-4135-aba8-54854b16c411\") " pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.815364 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7d2l\" (UniqueName: \"kubernetes.io/projected/3f4e68cb-70a1-40bd-815d-e35e0a3337a0-kube-api-access-v7d2l\") pod \"nmstate-console-plugin-86f58fcf4-6t69s\" (UID: \"3f4e68cb-70a1-40bd-815d-e35e0a3337a0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.815397 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c458e5d3-8c0a-4135-aba8-54854b16c411-ovs-socket\") pod \"nmstate-handler-ftlkm\" (UID: \"c458e5d3-8c0a-4135-aba8-54854b16c411\") " pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.815697 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qpfs\" (UniqueName: \"kubernetes.io/projected/c458e5d3-8c0a-4135-aba8-54854b16c411-kube-api-access-8qpfs\") pod \"nmstate-handler-ftlkm\" (UID: \"c458e5d3-8c0a-4135-aba8-54854b16c411\") " pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.815868 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c458e5d3-8c0a-4135-aba8-54854b16c411-nmstate-lock\") pod \"nmstate-handler-ftlkm\" (UID: \"c458e5d3-8c0a-4135-aba8-54854b16c411\") " pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.822008 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.822389 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.823101 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-ksc2f"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.847555 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"]
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.889628 4580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zfbrs"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.917124 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qpfs\" (UniqueName: \"kubernetes.io/projected/c458e5d3-8c0a-4135-aba8-54854b16c411-kube-api-access-8qpfs\") pod \"nmstate-handler-ftlkm\" (UID: \"c458e5d3-8c0a-4135-aba8-54854b16c411\") " pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.917197 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c458e5d3-8c0a-4135-aba8-54854b16c411-nmstate-lock\") pod \"nmstate-handler-ftlkm\" (UID: \"c458e5d3-8c0a-4135-aba8-54854b16c411\") " pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.917230 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3f4e68cb-70a1-40bd-815d-e35e0a3337a0-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-6t69s\" (UID: \"3f4e68cb-70a1-40bd-815d-e35e0a3337a0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.917248 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4e68cb-70a1-40bd-815d-e35e0a3337a0-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-6t69s\" (UID: \"3f4e68cb-70a1-40bd-815d-e35e0a3337a0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.917274 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c458e5d3-8c0a-4135-aba8-54854b16c411-dbus-socket\") pod \"nmstate-handler-ftlkm\" (UID: \"c458e5d3-8c0a-4135-aba8-54854b16c411\") " pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.917301 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7d2l\" (UniqueName: \"kubernetes.io/projected/3f4e68cb-70a1-40bd-815d-e35e0a3337a0-kube-api-access-v7d2l\") pod \"nmstate-console-plugin-86f58fcf4-6t69s\" (UID: \"3f4e68cb-70a1-40bd-815d-e35e0a3337a0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.917323 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c458e5d3-8c0a-4135-aba8-54854b16c411-ovs-socket\") pod \"nmstate-handler-ftlkm\" (UID: \"c458e5d3-8c0a-4135-aba8-54854b16c411\") " pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.917425 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c458e5d3-8c0a-4135-aba8-54854b16c411-ovs-socket\") pod \"nmstate-handler-ftlkm\" (UID: \"c458e5d3-8c0a-4135-aba8-54854b16c411\") " pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.917472 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c458e5d3-8c0a-4135-aba8-54854b16c411-nmstate-lock\") pod \"nmstate-handler-ftlkm\" (UID: \"c458e5d3-8c0a-4135-aba8-54854b16c411\") " pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.917924 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c458e5d3-8c0a-4135-aba8-54854b16c411-dbus-socket\") pod \"nmstate-handler-ftlkm\" (UID: \"c458e5d3-8c0a-4135-aba8-54854b16c411\") "
pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:16 crc kubenswrapper[4580]: E0321 05:06:16.917996 4580 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Mar 21 05:06:16 crc kubenswrapper[4580]: E0321 05:06:16.918060 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4e68cb-70a1-40bd-815d-e35e0a3337a0-plugin-serving-cert podName:3f4e68cb-70a1-40bd-815d-e35e0a3337a0 nodeName:}" failed. No retries permitted until 2026-03-21 05:06:17.418046803 +0000 UTC m=+882.500630431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/3f4e68cb-70a1-40bd-815d-e35e0a3337a0-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-6t69s" (UID: "3f4e68cb-70a1-40bd-815d-e35e0a3337a0") : secret "plugin-serving-cert" not found
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.918401 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3f4e68cb-70a1-40bd-815d-e35e0a3337a0-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-6t69s\" (UID: \"3f4e68cb-70a1-40bd-815d-e35e0a3337a0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.945095 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7d2l\" (UniqueName: \"kubernetes.io/projected/3f4e68cb-70a1-40bd-815d-e35e0a3337a0-kube-api-access-v7d2l\") pod \"nmstate-console-plugin-86f58fcf4-6t69s\" (UID: \"3f4e68cb-70a1-40bd-815d-e35e0a3337a0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.955220 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qpfs\" (UniqueName: \"kubernetes.io/projected/c458e5d3-8c0a-4135-aba8-54854b16c411-kube-api-access-8qpfs\") pod \"nmstate-handler-ftlkm\" (UID: \"c458e5d3-8c0a-4135-aba8-54854b16c411\") " pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:16 crc kubenswrapper[4580]: I0321 05:06:16.961911 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-ftlkm"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.045773 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-67bd66fdcb-4l9c5"]
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.046634 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.078405 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67bd66fdcb-4l9c5"]
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.225918 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/538e9507-45ba-4586-b08d-7bbf295e47af-oauth-serving-cert\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.226462 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/538e9507-45ba-4586-b08d-7bbf295e47af-trusted-ca-bundle\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.226516 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/538e9507-45ba-4586-b08d-7bbf295e47af-service-ca\") pod \"console-67bd66fdcb-4l9c5\" (UID:
\"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.226587 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/538e9507-45ba-4586-b08d-7bbf295e47af-console-serving-cert\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.226610 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/538e9507-45ba-4586-b08d-7bbf295e47af-console-config\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.226630 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/538e9507-45ba-4586-b08d-7bbf295e47af-console-oauth-config\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.226658 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wrtj\" (UniqueName: \"kubernetes.io/projected/538e9507-45ba-4586-b08d-7bbf295e47af-kube-api-access-7wrtj\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.226703 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/449ae922-f55a-437e-b18a-d6e2700cc02e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w4mjf\" (UID: \"449ae922-f55a-437e-b18a-d6e2700cc02e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.236586 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/449ae922-f55a-437e-b18a-d6e2700cc02e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w4mjf\" (UID: \"449ae922-f55a-437e-b18a-d6e2700cc02e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf"
Mar 21 05:06:17 crc kubenswrapper[4580]: W0321 05:06:17.259582 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b699e53_ece3_49d9_9f68_c3558aef7892.slice/crio-aab93662c099fd9e50b153d430546dab33154792ff3e19761523f1026d5dcd1b WatchSource:0}: Error finding container aab93662c099fd9e50b153d430546dab33154792ff3e19761523f1026d5dcd1b: Status 404 returned error can't find the container with id aab93662c099fd9e50b153d430546dab33154792ff3e19761523f1026d5dcd1b
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.259845 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-zfbrs"]
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.327834 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/538e9507-45ba-4586-b08d-7bbf295e47af-console-serving-cert\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.327876 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/538e9507-45ba-4586-b08d-7bbf295e47af-console-config\") pod
\"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.327903 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/538e9507-45ba-4586-b08d-7bbf295e47af-console-oauth-config\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.327928 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wrtj\" (UniqueName: \"kubernetes.io/projected/538e9507-45ba-4586-b08d-7bbf295e47af-kube-api-access-7wrtj\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.327964 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/538e9507-45ba-4586-b08d-7bbf295e47af-oauth-serving-cert\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.328009 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/538e9507-45ba-4586-b08d-7bbf295e47af-trusted-ca-bundle\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.328052 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/538e9507-45ba-4586-b08d-7bbf295e47af-service-ca\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.329051 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/538e9507-45ba-4586-b08d-7bbf295e47af-console-config\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.329337 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/538e9507-45ba-4586-b08d-7bbf295e47af-service-ca\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.332917 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/538e9507-45ba-4586-b08d-7bbf295e47af-console-serving-cert\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.333969 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/538e9507-45ba-4586-b08d-7bbf295e47af-oauth-serving-cert\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.334545 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/538e9507-45ba-4586-b08d-7bbf295e47af-console-oauth-config\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\")
" pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.335604 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/538e9507-45ba-4586-b08d-7bbf295e47af-trusted-ca-bundle\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.353054 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wrtj\" (UniqueName: \"kubernetes.io/projected/538e9507-45ba-4586-b08d-7bbf295e47af-kube-api-access-7wrtj\") pod \"console-67bd66fdcb-4l9c5\" (UID: \"538e9507-45ba-4586-b08d-7bbf295e47af\") " pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.368264 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67bd66fdcb-4l9c5"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.399219 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwmlz"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.432000 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4e68cb-70a1-40bd-815d-e35e0a3337a0-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-6t69s\" (UID: \"3f4e68cb-70a1-40bd-815d-e35e0a3337a0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.438280 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4e68cb-70a1-40bd-815d-e35e0a3337a0-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-6t69s\" (UID: \"3f4e68cb-70a1-40bd-815d-e35e0a3337a0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.512358 4580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.534049 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-catalog-content\") pod \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\" (UID: \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\") "
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.534128 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l7cp\" (UniqueName: \"kubernetes.io/projected/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-kube-api-access-9l7cp\") pod \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\" (UID: \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\") "
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.534210 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-utilities\") pod \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\" (UID: \"3a26cc0d-3ca6-4fa4-98dc-554493370fa6\") "
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.536913 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-utilities" (OuterVolumeSpecName: "utilities") pod "3a26cc0d-3ca6-4fa4-98dc-554493370fa6" (UID: "3a26cc0d-3ca6-4fa4-98dc-554493370fa6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.541894 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-kube-api-access-9l7cp" (OuterVolumeSpecName: "kube-api-access-9l7cp") pod "3a26cc0d-3ca6-4fa4-98dc-554493370fa6" (UID: "3a26cc0d-3ca6-4fa4-98dc-554493370fa6"). InnerVolumeSpecName "kube-api-access-9l7cp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.584446 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67bd66fdcb-4l9c5"]
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.587696 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ftlkm" event={"ID":"c458e5d3-8c0a-4135-aba8-54854b16c411","Type":"ContainerStarted","Data":"0c597b7acfcee0073adeac9ccdeb0a9d56761a3e7842416d0b8e7a1d7d493f56"}
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.599450 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zfbrs" event={"ID":"1b699e53-ece3-49d9-9f68-c3558aef7892","Type":"ContainerStarted","Data":"aab93662c099fd9e50b153d430546dab33154792ff3e19761523f1026d5dcd1b"}
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.602410 4580 generic.go:334] "Generic (PLEG): container finished" podID="3a26cc0d-3ca6-4fa4-98dc-554493370fa6" containerID="1682e41c7116ca20a7f2ba051d71fd99dfe7ab84bde7de6834a127917d448015" exitCode=0
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.602468 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwmlz" event={"ID":"3a26cc0d-3ca6-4fa4-98dc-554493370fa6","Type":"ContainerDied","Data":"1682e41c7116ca20a7f2ba051d71fd99dfe7ab84bde7de6834a127917d448015"}
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.602507 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwmlz" event={"ID":"3a26cc0d-3ca6-4fa4-98dc-554493370fa6","Type":"ContainerDied","Data":"ac7726ed82c569259f45fec1fe7bbce9fcd3f777bf0b71767b525aba430c3865"}
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.602552 4580 scope.go:117] "RemoveContainer" containerID="1682e41c7116ca20a7f2ba051d71fd99dfe7ab84bde7de6834a127917d448015"
Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.602771
4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwmlz" Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.636930 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l7cp\" (UniqueName: \"kubernetes.io/projected/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-kube-api-access-9l7cp\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.636961 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.706056 4580 scope.go:117] "RemoveContainer" containerID="e0d365545a5a41bdd8a37b83317857b41abd1c3b7526a9b685004a98ff6c5b2f" Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.737423 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s" Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.738369 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a26cc0d-3ca6-4fa4-98dc-554493370fa6" (UID: "3a26cc0d-3ca6-4fa4-98dc-554493370fa6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.753409 4580 scope.go:117] "RemoveContainer" containerID="a6a230cfb7f1ecb7f1f72884b0fcd43aae99b2a5b8769d80f7163b7b3005639a" Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.840544 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a26cc0d-3ca6-4fa4-98dc-554493370fa6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.843373 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf"] Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.851823 4580 scope.go:117] "RemoveContainer" containerID="1682e41c7116ca20a7f2ba051d71fd99dfe7ab84bde7de6834a127917d448015" Mar 21 05:06:17 crc kubenswrapper[4580]: E0321 05:06:17.854509 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1682e41c7116ca20a7f2ba051d71fd99dfe7ab84bde7de6834a127917d448015\": container with ID starting with 1682e41c7116ca20a7f2ba051d71fd99dfe7ab84bde7de6834a127917d448015 not found: ID does not exist" containerID="1682e41c7116ca20a7f2ba051d71fd99dfe7ab84bde7de6834a127917d448015" Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.854584 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1682e41c7116ca20a7f2ba051d71fd99dfe7ab84bde7de6834a127917d448015"} err="failed to get container status \"1682e41c7116ca20a7f2ba051d71fd99dfe7ab84bde7de6834a127917d448015\": rpc error: code = NotFound desc = could not find container \"1682e41c7116ca20a7f2ba051d71fd99dfe7ab84bde7de6834a127917d448015\": container with ID starting with 1682e41c7116ca20a7f2ba051d71fd99dfe7ab84bde7de6834a127917d448015 not found: ID does not exist" Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.854632 4580 scope.go:117] 
"RemoveContainer" containerID="e0d365545a5a41bdd8a37b83317857b41abd1c3b7526a9b685004a98ff6c5b2f" Mar 21 05:06:17 crc kubenswrapper[4580]: E0321 05:06:17.856068 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0d365545a5a41bdd8a37b83317857b41abd1c3b7526a9b685004a98ff6c5b2f\": container with ID starting with e0d365545a5a41bdd8a37b83317857b41abd1c3b7526a9b685004a98ff6c5b2f not found: ID does not exist" containerID="e0d365545a5a41bdd8a37b83317857b41abd1c3b7526a9b685004a98ff6c5b2f" Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.856126 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d365545a5a41bdd8a37b83317857b41abd1c3b7526a9b685004a98ff6c5b2f"} err="failed to get container status \"e0d365545a5a41bdd8a37b83317857b41abd1c3b7526a9b685004a98ff6c5b2f\": rpc error: code = NotFound desc = could not find container \"e0d365545a5a41bdd8a37b83317857b41abd1c3b7526a9b685004a98ff6c5b2f\": container with ID starting with e0d365545a5a41bdd8a37b83317857b41abd1c3b7526a9b685004a98ff6c5b2f not found: ID does not exist" Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.856171 4580 scope.go:117] "RemoveContainer" containerID="a6a230cfb7f1ecb7f1f72884b0fcd43aae99b2a5b8769d80f7163b7b3005639a" Mar 21 05:06:17 crc kubenswrapper[4580]: E0321 05:06:17.856908 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6a230cfb7f1ecb7f1f72884b0fcd43aae99b2a5b8769d80f7163b7b3005639a\": container with ID starting with a6a230cfb7f1ecb7f1f72884b0fcd43aae99b2a5b8769d80f7163b7b3005639a not found: ID does not exist" containerID="a6a230cfb7f1ecb7f1f72884b0fcd43aae99b2a5b8769d80f7163b7b3005639a" Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.857015 4580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a6a230cfb7f1ecb7f1f72884b0fcd43aae99b2a5b8769d80f7163b7b3005639a"} err="failed to get container status \"a6a230cfb7f1ecb7f1f72884b0fcd43aae99b2a5b8769d80f7163b7b3005639a\": rpc error: code = NotFound desc = could not find container \"a6a230cfb7f1ecb7f1f72884b0fcd43aae99b2a5b8769d80f7163b7b3005639a\": container with ID starting with a6a230cfb7f1ecb7f1f72884b0fcd43aae99b2a5b8769d80f7163b7b3005639a not found: ID does not exist" Mar 21 05:06:17 crc kubenswrapper[4580]: W0321 05:06:17.861197 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod449ae922_f55a_437e_b18a_d6e2700cc02e.slice/crio-53cd9371036020c4590c3d727d1eb69dc1cb64f196a46679397b3983f312abe5 WatchSource:0}: Error finding container 53cd9371036020c4590c3d727d1eb69dc1cb64f196a46679397b3983f312abe5: Status 404 returned error can't find the container with id 53cd9371036020c4590c3d727d1eb69dc1cb64f196a46679397b3983f312abe5 Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.951387 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vwmlz"] Mar 21 05:06:17 crc kubenswrapper[4580]: I0321 05:06:17.960329 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vwmlz"] Mar 21 05:06:18 crc kubenswrapper[4580]: I0321 05:06:18.225718 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s"] Mar 21 05:06:18 crc kubenswrapper[4580]: I0321 05:06:18.608274 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf" event={"ID":"449ae922-f55a-437e-b18a-d6e2700cc02e","Type":"ContainerStarted","Data":"53cd9371036020c4590c3d727d1eb69dc1cb64f196a46679397b3983f312abe5"} Mar 21 05:06:18 crc kubenswrapper[4580]: I0321 05:06:18.610370 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-67bd66fdcb-4l9c5" event={"ID":"538e9507-45ba-4586-b08d-7bbf295e47af","Type":"ContainerStarted","Data":"e3ea281dd9f19f05760636e83a9083219ec01b7c48e578e7bc669dca18b2805c"} Mar 21 05:06:18 crc kubenswrapper[4580]: I0321 05:06:18.610396 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67bd66fdcb-4l9c5" event={"ID":"538e9507-45ba-4586-b08d-7bbf295e47af","Type":"ContainerStarted","Data":"c4e9b69974190cdbf54e16bd34cc465e2501c36d26c9f451e72158217ca4ba68"} Mar 21 05:06:18 crc kubenswrapper[4580]: I0321 05:06:18.612632 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s" event={"ID":"3f4e68cb-70a1-40bd-815d-e35e0a3337a0","Type":"ContainerStarted","Data":"f9670261357ad55ee37acb6e5457e4a58265d5d1b3213224f0200439c50ce1ba"} Mar 21 05:06:18 crc kubenswrapper[4580]: I0321 05:06:18.633027 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67bd66fdcb-4l9c5" podStartSLOduration=1.633003386 podStartE2EDuration="1.633003386s" podCreationTimestamp="2026-03-21 05:06:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:06:18.630461309 +0000 UTC m=+883.713044957" watchObservedRunningTime="2026-03-21 05:06:18.633003386 +0000 UTC m=+883.715587014" Mar 21 05:06:19 crc kubenswrapper[4580]: I0321 05:06:19.624743 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a26cc0d-3ca6-4fa4-98dc-554493370fa6" path="/var/lib/kubelet/pods/3a26cc0d-3ca6-4fa4-98dc-554493370fa6/volumes" Mar 21 05:06:20 crc kubenswrapper[4580]: I0321 05:06:20.635154 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ftlkm" event={"ID":"c458e5d3-8c0a-4135-aba8-54854b16c411","Type":"ContainerStarted","Data":"3ea63b6ebc94216baa3ab634cfb5e9d3032a95c95bbec966f2a5dd725f7d1761"} Mar 21 
05:06:20 crc kubenswrapper[4580]: I0321 05:06:20.635663 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-ftlkm" Mar 21 05:06:20 crc kubenswrapper[4580]: I0321 05:06:20.638467 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf" event={"ID":"449ae922-f55a-437e-b18a-d6e2700cc02e","Type":"ContainerStarted","Data":"e5b5233673762f37ad24ea7c978d53e7c3f67d7faca05a63b2bb1012a7c13b29"} Mar 21 05:06:20 crc kubenswrapper[4580]: I0321 05:06:20.639014 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf" Mar 21 05:06:20 crc kubenswrapper[4580]: I0321 05:06:20.641571 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zfbrs" event={"ID":"1b699e53-ece3-49d9-9f68-c3558aef7892","Type":"ContainerStarted","Data":"84133ba1f68e87a443458099510caf634b4e7d026f907592548a1cc875fd63ec"} Mar 21 05:06:20 crc kubenswrapper[4580]: I0321 05:06:20.681434 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf" podStartSLOduration=2.413678785 podStartE2EDuration="4.681409629s" podCreationTimestamp="2026-03-21 05:06:16 +0000 UTC" firstStartedPulling="2026-03-21 05:06:17.866169936 +0000 UTC m=+882.948753564" lastFinishedPulling="2026-03-21 05:06:20.13390078 +0000 UTC m=+885.216484408" observedRunningTime="2026-03-21 05:06:20.678096202 +0000 UTC m=+885.760679840" watchObservedRunningTime="2026-03-21 05:06:20.681409629 +0000 UTC m=+885.763993267" Mar 21 05:06:20 crc kubenswrapper[4580]: I0321 05:06:20.687893 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-ftlkm" podStartSLOduration=1.565416803 podStartE2EDuration="4.68787151s" podCreationTimestamp="2026-03-21 05:06:16 +0000 UTC" firstStartedPulling="2026-03-21 05:06:17.002472565 
+0000 UTC m=+882.085056193" lastFinishedPulling="2026-03-21 05:06:20.124927272 +0000 UTC m=+885.207510900" observedRunningTime="2026-03-21 05:06:20.660146067 +0000 UTC m=+885.742729705" watchObservedRunningTime="2026-03-21 05:06:20.68787151 +0000 UTC m=+885.770455138" Mar 21 05:06:21 crc kubenswrapper[4580]: I0321 05:06:21.656108 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s" event={"ID":"3f4e68cb-70a1-40bd-815d-e35e0a3337a0","Type":"ContainerStarted","Data":"f84027eaae4d6016d44dfaa47e8b9a94e388135f102b2772b35060a6dc9d5397"} Mar 21 05:06:21 crc kubenswrapper[4580]: I0321 05:06:21.693597 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6t69s" podStartSLOduration=2.6672052 podStartE2EDuration="5.693564356s" podCreationTimestamp="2026-03-21 05:06:16 +0000 UTC" firstStartedPulling="2026-03-21 05:06:18.237337772 +0000 UTC m=+883.319921420" lastFinishedPulling="2026-03-21 05:06:21.263696948 +0000 UTC m=+886.346280576" observedRunningTime="2026-03-21 05:06:21.686359925 +0000 UTC m=+886.768943553" watchObservedRunningTime="2026-03-21 05:06:21.693564356 +0000 UTC m=+886.776147994" Mar 21 05:06:23 crc kubenswrapper[4580]: I0321 05:06:23.671401 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zfbrs" event={"ID":"1b699e53-ece3-49d9-9f68-c3558aef7892","Type":"ContainerStarted","Data":"67b2ec55c55979ec6a1261e6fccba06df69301d856a4bb46008e054039ed997b"} Mar 21 05:06:23 crc kubenswrapper[4580]: I0321 05:06:23.692910 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zfbrs" podStartSLOduration=2.20507162 podStartE2EDuration="7.692868291s" podCreationTimestamp="2026-03-21 05:06:16 +0000 UTC" firstStartedPulling="2026-03-21 05:06:17.262281476 +0000 UTC m=+882.344865104" lastFinishedPulling="2026-03-21 
05:06:22.750078147 +0000 UTC m=+887.832661775" observedRunningTime="2026-03-21 05:06:23.690255571 +0000 UTC m=+888.772839219" watchObservedRunningTime="2026-03-21 05:06:23.692868291 +0000 UTC m=+888.775451919" Mar 21 05:06:26 crc kubenswrapper[4580]: I0321 05:06:26.987709 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-ftlkm" Mar 21 05:06:27 crc kubenswrapper[4580]: I0321 05:06:27.370157 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67bd66fdcb-4l9c5" Mar 21 05:06:27 crc kubenswrapper[4580]: I0321 05:06:27.370247 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-67bd66fdcb-4l9c5" Mar 21 05:06:27 crc kubenswrapper[4580]: I0321 05:06:27.376489 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67bd66fdcb-4l9c5" Mar 21 05:06:27 crc kubenswrapper[4580]: I0321 05:06:27.703825 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67bd66fdcb-4l9c5" Mar 21 05:06:27 crc kubenswrapper[4580]: I0321 05:06:27.771490 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-48dqz"] Mar 21 05:06:37 crc kubenswrapper[4580]: I0321 05:06:37.520937 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w4mjf" Mar 21 05:06:45 crc kubenswrapper[4580]: I0321 05:06:45.948062 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:06:45 crc kubenswrapper[4580]: I0321 05:06:45.948834 4580 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.412927 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv"] Mar 21 05:06:50 crc kubenswrapper[4580]: E0321 05:06:50.413910 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a26cc0d-3ca6-4fa4-98dc-554493370fa6" containerName="extract-utilities" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.413931 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a26cc0d-3ca6-4fa4-98dc-554493370fa6" containerName="extract-utilities" Mar 21 05:06:50 crc kubenswrapper[4580]: E0321 05:06:50.413944 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a26cc0d-3ca6-4fa4-98dc-554493370fa6" containerName="extract-content" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.413950 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a26cc0d-3ca6-4fa4-98dc-554493370fa6" containerName="extract-content" Mar 21 05:06:50 crc kubenswrapper[4580]: E0321 05:06:50.413967 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a26cc0d-3ca6-4fa4-98dc-554493370fa6" containerName="registry-server" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.413975 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a26cc0d-3ca6-4fa4-98dc-554493370fa6" containerName="registry-server" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.414144 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a26cc0d-3ca6-4fa4-98dc-554493370fa6" containerName="registry-server" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.415252 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.420888 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.427130 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv"] Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.504503 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66538d86-0387-4c8b-b266-4ee60f1cbd90-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv\" (UID: \"66538d86-0387-4c8b-b266-4ee60f1cbd90\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.504625 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66538d86-0387-4c8b-b266-4ee60f1cbd90-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv\" (UID: \"66538d86-0387-4c8b-b266-4ee60f1cbd90\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.504649 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbhqf\" (UniqueName: \"kubernetes.io/projected/66538d86-0387-4c8b-b266-4ee60f1cbd90-kube-api-access-cbhqf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv\" (UID: \"66538d86-0387-4c8b-b266-4ee60f1cbd90\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" Mar 21 05:06:50 crc kubenswrapper[4580]: 
I0321 05:06:50.606464 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66538d86-0387-4c8b-b266-4ee60f1cbd90-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv\" (UID: \"66538d86-0387-4c8b-b266-4ee60f1cbd90\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.606574 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66538d86-0387-4c8b-b266-4ee60f1cbd90-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv\" (UID: \"66538d86-0387-4c8b-b266-4ee60f1cbd90\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.606595 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbhqf\" (UniqueName: \"kubernetes.io/projected/66538d86-0387-4c8b-b266-4ee60f1cbd90-kube-api-access-cbhqf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv\" (UID: \"66538d86-0387-4c8b-b266-4ee60f1cbd90\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.607175 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66538d86-0387-4c8b-b266-4ee60f1cbd90-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv\" (UID: \"66538d86-0387-4c8b-b266-4ee60f1cbd90\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.607265 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/66538d86-0387-4c8b-b266-4ee60f1cbd90-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv\" (UID: \"66538d86-0387-4c8b-b266-4ee60f1cbd90\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.634978 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbhqf\" (UniqueName: \"kubernetes.io/projected/66538d86-0387-4c8b-b266-4ee60f1cbd90-kube-api-access-cbhqf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv\" (UID: \"66538d86-0387-4c8b-b266-4ee60f1cbd90\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.734066 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" Mar 21 05:06:50 crc kubenswrapper[4580]: I0321 05:06:50.988630 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv"] Mar 21 05:06:51 crc kubenswrapper[4580]: I0321 05:06:51.861299 4580 scope.go:117] "RemoveContainer" containerID="3d82b86ba408bf8ebe802a0ff317fb66ac3dc6f003327d8ca90a9d47db056de8" Mar 21 05:06:51 crc kubenswrapper[4580]: I0321 05:06:51.907384 4580 generic.go:334] "Generic (PLEG): container finished" podID="66538d86-0387-4c8b-b266-4ee60f1cbd90" containerID="7442c69bf363bd194909b3f8029095a72ddcb365e911579dfc1d2c3392e24122" exitCode=0 Mar 21 05:06:51 crc kubenswrapper[4580]: I0321 05:06:51.907502 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" event={"ID":"66538d86-0387-4c8b-b266-4ee60f1cbd90","Type":"ContainerDied","Data":"7442c69bf363bd194909b3f8029095a72ddcb365e911579dfc1d2c3392e24122"} 
Mar 21 05:06:51 crc kubenswrapper[4580]: I0321 05:06:51.907600 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" event={"ID":"66538d86-0387-4c8b-b266-4ee60f1cbd90","Type":"ContainerStarted","Data":"91237b0f828dd99a3d805d6dbe19bf8a58f489b40c4afe5ef2aff024a1672b06"} Mar 21 05:06:52 crc kubenswrapper[4580]: I0321 05:06:52.819629 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-48dqz" podUID="bcade120-6711-4045-9149-08985699febd" containerName="console" containerID="cri-o://3bbe4a9cc386beb963f743da2e4139c3d7607ebb625d17066b6601f480d8ee82" gracePeriod=15 Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.213179 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-48dqz_bcade120-6711-4045-9149-08985699febd/console/0.log" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.213255 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-48dqz" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.348140 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bcade120-6711-4045-9149-08985699febd" (UID: "bcade120-6711-4045-9149-08985699febd"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.347056 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-oauth-serving-cert\") pod \"bcade120-6711-4045-9149-08985699febd\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.348350 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-service-ca\") pod \"bcade120-6711-4045-9149-08985699febd\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.348384 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-trusted-ca-bundle\") pod \"bcade120-6711-4045-9149-08985699febd\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.348910 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcade120-6711-4045-9149-08985699febd-console-oauth-config\") pod \"bcade120-6711-4045-9149-08985699febd\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.348947 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcade120-6711-4045-9149-08985699febd-console-serving-cert\") pod \"bcade120-6711-4045-9149-08985699febd\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.349035 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-8q5sw\" (UniqueName: \"kubernetes.io/projected/bcade120-6711-4045-9149-08985699febd-kube-api-access-8q5sw\") pod \"bcade120-6711-4045-9149-08985699febd\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.349250 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-console-config\") pod \"bcade120-6711-4045-9149-08985699febd\" (UID: \"bcade120-6711-4045-9149-08985699febd\") " Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.349705 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-service-ca" (OuterVolumeSpecName: "service-ca") pod "bcade120-6711-4045-9149-08985699febd" (UID: "bcade120-6711-4045-9149-08985699febd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.349760 4580 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.350161 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bcade120-6711-4045-9149-08985699febd" (UID: "bcade120-6711-4045-9149-08985699febd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.350209 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-console-config" (OuterVolumeSpecName: "console-config") pod "bcade120-6711-4045-9149-08985699febd" (UID: "bcade120-6711-4045-9149-08985699febd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.358432 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcade120-6711-4045-9149-08985699febd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bcade120-6711-4045-9149-08985699febd" (UID: "bcade120-6711-4045-9149-08985699febd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.358840 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcade120-6711-4045-9149-08985699febd-kube-api-access-8q5sw" (OuterVolumeSpecName: "kube-api-access-8q5sw") pod "bcade120-6711-4045-9149-08985699febd" (UID: "bcade120-6711-4045-9149-08985699febd"). InnerVolumeSpecName "kube-api-access-8q5sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.364377 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcade120-6711-4045-9149-08985699febd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bcade120-6711-4045-9149-08985699febd" (UID: "bcade120-6711-4045-9149-08985699febd"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.452308 4580 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcade120-6711-4045-9149-08985699febd-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.452360 4580 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcade120-6711-4045-9149-08985699febd-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.452373 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q5sw\" (UniqueName: \"kubernetes.io/projected/bcade120-6711-4045-9149-08985699febd-kube-api-access-8q5sw\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.452385 4580 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-console-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.452397 4580 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.452409 4580 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcade120-6711-4045-9149-08985699febd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.927428 4580 generic.go:334] "Generic (PLEG): container finished" podID="66538d86-0387-4c8b-b266-4ee60f1cbd90" containerID="02ef3c935dd2b9e937f3140449180398511fde1f4e4e6919e211df9bf8b204ed" exitCode=0 Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 
05:06:53.927549 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" event={"ID":"66538d86-0387-4c8b-b266-4ee60f1cbd90","Type":"ContainerDied","Data":"02ef3c935dd2b9e937f3140449180398511fde1f4e4e6919e211df9bf8b204ed"} Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.939178 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-48dqz_bcade120-6711-4045-9149-08985699febd/console/0.log" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.939259 4580 generic.go:334] "Generic (PLEG): container finished" podID="bcade120-6711-4045-9149-08985699febd" containerID="3bbe4a9cc386beb963f743da2e4139c3d7607ebb625d17066b6601f480d8ee82" exitCode=2 Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.939390 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-48dqz" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.939360 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-48dqz" event={"ID":"bcade120-6711-4045-9149-08985699febd","Type":"ContainerDied","Data":"3bbe4a9cc386beb963f743da2e4139c3d7607ebb625d17066b6601f480d8ee82"} Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.939622 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-48dqz" event={"ID":"bcade120-6711-4045-9149-08985699febd","Type":"ContainerDied","Data":"95fffe63db875a2bf780f900dccbde29370cf2f478e92b558183ea6a529d0890"} Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.939670 4580 scope.go:117] "RemoveContainer" containerID="3bbe4a9cc386beb963f743da2e4139c3d7607ebb625d17066b6601f480d8ee82" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.972146 4580 scope.go:117] "RemoveContainer" containerID="3bbe4a9cc386beb963f743da2e4139c3d7607ebb625d17066b6601f480d8ee82" Mar 21 05:06:53 crc 
kubenswrapper[4580]: E0321 05:06:53.975259 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bbe4a9cc386beb963f743da2e4139c3d7607ebb625d17066b6601f480d8ee82\": container with ID starting with 3bbe4a9cc386beb963f743da2e4139c3d7607ebb625d17066b6601f480d8ee82 not found: ID does not exist" containerID="3bbe4a9cc386beb963f743da2e4139c3d7607ebb625d17066b6601f480d8ee82" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.975415 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bbe4a9cc386beb963f743da2e4139c3d7607ebb625d17066b6601f480d8ee82"} err="failed to get container status \"3bbe4a9cc386beb963f743da2e4139c3d7607ebb625d17066b6601f480d8ee82\": rpc error: code = NotFound desc = could not find container \"3bbe4a9cc386beb963f743da2e4139c3d7607ebb625d17066b6601f480d8ee82\": container with ID starting with 3bbe4a9cc386beb963f743da2e4139c3d7607ebb625d17066b6601f480d8ee82 not found: ID does not exist" Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.989162 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-48dqz"] Mar 21 05:06:53 crc kubenswrapper[4580]: I0321 05:06:53.993412 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-48dqz"] Mar 21 05:06:54 crc kubenswrapper[4580]: I0321 05:06:54.948091 4580 generic.go:334] "Generic (PLEG): container finished" podID="66538d86-0387-4c8b-b266-4ee60f1cbd90" containerID="2dbe56a09eabdf5bcac84eb1dfddcdd9b7394ceb98abf4a8fdd103b2ff273dc5" exitCode=0 Mar 21 05:06:54 crc kubenswrapper[4580]: I0321 05:06:54.948395 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" event={"ID":"66538d86-0387-4c8b-b266-4ee60f1cbd90","Type":"ContainerDied","Data":"2dbe56a09eabdf5bcac84eb1dfddcdd9b7394ceb98abf4a8fdd103b2ff273dc5"} Mar 21 
05:06:55 crc kubenswrapper[4580]: I0321 05:06:55.628998 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcade120-6711-4045-9149-08985699febd" path="/var/lib/kubelet/pods/bcade120-6711-4045-9149-08985699febd/volumes" Mar 21 05:06:56 crc kubenswrapper[4580]: I0321 05:06:56.171517 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" Mar 21 05:06:56 crc kubenswrapper[4580]: I0321 05:06:56.300979 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66538d86-0387-4c8b-b266-4ee60f1cbd90-util\") pod \"66538d86-0387-4c8b-b266-4ee60f1cbd90\" (UID: \"66538d86-0387-4c8b-b266-4ee60f1cbd90\") " Mar 21 05:06:56 crc kubenswrapper[4580]: I0321 05:06:56.301079 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbhqf\" (UniqueName: \"kubernetes.io/projected/66538d86-0387-4c8b-b266-4ee60f1cbd90-kube-api-access-cbhqf\") pod \"66538d86-0387-4c8b-b266-4ee60f1cbd90\" (UID: \"66538d86-0387-4c8b-b266-4ee60f1cbd90\") " Mar 21 05:06:56 crc kubenswrapper[4580]: I0321 05:06:56.301146 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66538d86-0387-4c8b-b266-4ee60f1cbd90-bundle\") pod \"66538d86-0387-4c8b-b266-4ee60f1cbd90\" (UID: \"66538d86-0387-4c8b-b266-4ee60f1cbd90\") " Mar 21 05:06:56 crc kubenswrapper[4580]: I0321 05:06:56.302227 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66538d86-0387-4c8b-b266-4ee60f1cbd90-bundle" (OuterVolumeSpecName: "bundle") pod "66538d86-0387-4c8b-b266-4ee60f1cbd90" (UID: "66538d86-0387-4c8b-b266-4ee60f1cbd90"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:06:56 crc kubenswrapper[4580]: I0321 05:06:56.306525 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66538d86-0387-4c8b-b266-4ee60f1cbd90-kube-api-access-cbhqf" (OuterVolumeSpecName: "kube-api-access-cbhqf") pod "66538d86-0387-4c8b-b266-4ee60f1cbd90" (UID: "66538d86-0387-4c8b-b266-4ee60f1cbd90"). InnerVolumeSpecName "kube-api-access-cbhqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:56 crc kubenswrapper[4580]: I0321 05:06:56.315934 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66538d86-0387-4c8b-b266-4ee60f1cbd90-util" (OuterVolumeSpecName: "util") pod "66538d86-0387-4c8b-b266-4ee60f1cbd90" (UID: "66538d86-0387-4c8b-b266-4ee60f1cbd90"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:06:56 crc kubenswrapper[4580]: I0321 05:06:56.402596 4580 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66538d86-0387-4c8b-b266-4ee60f1cbd90-util\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:56 crc kubenswrapper[4580]: I0321 05:06:56.402640 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbhqf\" (UniqueName: \"kubernetes.io/projected/66538d86-0387-4c8b-b266-4ee60f1cbd90-kube-api-access-cbhqf\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:56 crc kubenswrapper[4580]: I0321 05:06:56.402651 4580 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66538d86-0387-4c8b-b266-4ee60f1cbd90-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:56 crc kubenswrapper[4580]: I0321 05:06:56.967444 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" 
event={"ID":"66538d86-0387-4c8b-b266-4ee60f1cbd90","Type":"ContainerDied","Data":"91237b0f828dd99a3d805d6dbe19bf8a58f489b40c4afe5ef2aff024a1672b06"} Mar 21 05:06:56 crc kubenswrapper[4580]: I0321 05:06:56.967509 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91237b0f828dd99a3d805d6dbe19bf8a58f489b40c4afe5ef2aff024a1672b06" Mar 21 05:06:56 crc kubenswrapper[4580]: I0321 05:06:56.967520 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.679328 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw"] Mar 21 05:07:05 crc kubenswrapper[4580]: E0321 05:07:05.680456 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66538d86-0387-4c8b-b266-4ee60f1cbd90" containerName="util" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.680476 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="66538d86-0387-4c8b-b266-4ee60f1cbd90" containerName="util" Mar 21 05:07:05 crc kubenswrapper[4580]: E0321 05:07:05.680496 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66538d86-0387-4c8b-b266-4ee60f1cbd90" containerName="pull" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.680503 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="66538d86-0387-4c8b-b266-4ee60f1cbd90" containerName="pull" Mar 21 05:07:05 crc kubenswrapper[4580]: E0321 05:07:05.680516 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcade120-6711-4045-9149-08985699febd" containerName="console" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.680524 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcade120-6711-4045-9149-08985699febd" containerName="console" Mar 21 05:07:05 crc kubenswrapper[4580]: E0321 05:07:05.680534 4580 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66538d86-0387-4c8b-b266-4ee60f1cbd90" containerName="extract" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.680541 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="66538d86-0387-4c8b-b266-4ee60f1cbd90" containerName="extract" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.680661 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="66538d86-0387-4c8b-b266-4ee60f1cbd90" containerName="extract" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.680686 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcade120-6711-4045-9149-08985699febd" containerName="console" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.681137 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.683974 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.684594 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.684264 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.689651 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.689651 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ndfmq" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.713471 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw"] Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.838561 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58dv9\" (UniqueName: \"kubernetes.io/projected/22ee4637-a40f-4200-be5a-679e0912f4cf-kube-api-access-58dv9\") pod \"metallb-operator-controller-manager-cf889bd6-5qrxw\" (UID: \"22ee4637-a40f-4200-be5a-679e0912f4cf\") " pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.838630 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22ee4637-a40f-4200-be5a-679e0912f4cf-apiservice-cert\") pod \"metallb-operator-controller-manager-cf889bd6-5qrxw\" (UID: \"22ee4637-a40f-4200-be5a-679e0912f4cf\") " pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.838678 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22ee4637-a40f-4200-be5a-679e0912f4cf-webhook-cert\") pod \"metallb-operator-controller-manager-cf889bd6-5qrxw\" (UID: \"22ee4637-a40f-4200-be5a-679e0912f4cf\") " pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.940075 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58dv9\" (UniqueName: \"kubernetes.io/projected/22ee4637-a40f-4200-be5a-679e0912f4cf-kube-api-access-58dv9\") pod \"metallb-operator-controller-manager-cf889bd6-5qrxw\" (UID: \"22ee4637-a40f-4200-be5a-679e0912f4cf\") " pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.940560 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22ee4637-a40f-4200-be5a-679e0912f4cf-apiservice-cert\") pod \"metallb-operator-controller-manager-cf889bd6-5qrxw\" (UID: \"22ee4637-a40f-4200-be5a-679e0912f4cf\") " pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.940723 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22ee4637-a40f-4200-be5a-679e0912f4cf-webhook-cert\") pod \"metallb-operator-controller-manager-cf889bd6-5qrxw\" (UID: \"22ee4637-a40f-4200-be5a-679e0912f4cf\") " pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.966473 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22ee4637-a40f-4200-be5a-679e0912f4cf-webhook-cert\") pod \"metallb-operator-controller-manager-cf889bd6-5qrxw\" (UID: \"22ee4637-a40f-4200-be5a-679e0912f4cf\") " pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.970513 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22ee4637-a40f-4200-be5a-679e0912f4cf-apiservice-cert\") pod \"metallb-operator-controller-manager-cf889bd6-5qrxw\" (UID: \"22ee4637-a40f-4200-be5a-679e0912f4cf\") " pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.976287 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf"] Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.977453 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.983260 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.983467 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qnzv6" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.983515 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.983747 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58dv9\" (UniqueName: \"kubernetes.io/projected/22ee4637-a40f-4200-be5a-679e0912f4cf-kube-api-access-58dv9\") pod \"metallb-operator-controller-manager-cf889bd6-5qrxw\" (UID: \"22ee4637-a40f-4200-be5a-679e0912f4cf\") " pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" Mar 21 05:07:05 crc kubenswrapper[4580]: I0321 05:07:05.999129 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.009485 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf"] Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.043400 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d9a7806-fa8d-4106-9241-a32afafc5eb7-apiservice-cert\") pod \"metallb-operator-webhook-server-7d5b654677-cwqcf\" (UID: \"1d9a7806-fa8d-4106-9241-a32afafc5eb7\") " pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.043463 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlbhr\" (UniqueName: \"kubernetes.io/projected/1d9a7806-fa8d-4106-9241-a32afafc5eb7-kube-api-access-hlbhr\") pod \"metallb-operator-webhook-server-7d5b654677-cwqcf\" (UID: \"1d9a7806-fa8d-4106-9241-a32afafc5eb7\") " pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.043513 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d9a7806-fa8d-4106-9241-a32afafc5eb7-webhook-cert\") pod \"metallb-operator-webhook-server-7d5b654677-cwqcf\" (UID: \"1d9a7806-fa8d-4106-9241-a32afafc5eb7\") " pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.145811 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d9a7806-fa8d-4106-9241-a32afafc5eb7-apiservice-cert\") pod \"metallb-operator-webhook-server-7d5b654677-cwqcf\" (UID: 
\"1d9a7806-fa8d-4106-9241-a32afafc5eb7\") " pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.146205 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlbhr\" (UniqueName: \"kubernetes.io/projected/1d9a7806-fa8d-4106-9241-a32afafc5eb7-kube-api-access-hlbhr\") pod \"metallb-operator-webhook-server-7d5b654677-cwqcf\" (UID: \"1d9a7806-fa8d-4106-9241-a32afafc5eb7\") " pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.146271 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d9a7806-fa8d-4106-9241-a32afafc5eb7-webhook-cert\") pod \"metallb-operator-webhook-server-7d5b654677-cwqcf\" (UID: \"1d9a7806-fa8d-4106-9241-a32afafc5eb7\") " pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.155687 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d9a7806-fa8d-4106-9241-a32afafc5eb7-webhook-cert\") pod \"metallb-operator-webhook-server-7d5b654677-cwqcf\" (UID: \"1d9a7806-fa8d-4106-9241-a32afafc5eb7\") " pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.156259 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d9a7806-fa8d-4106-9241-a32afafc5eb7-apiservice-cert\") pod \"metallb-operator-webhook-server-7d5b654677-cwqcf\" (UID: \"1d9a7806-fa8d-4106-9241-a32afafc5eb7\") " pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.176300 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlbhr\" 
(UniqueName: \"kubernetes.io/projected/1d9a7806-fa8d-4106-9241-a32afafc5eb7-kube-api-access-hlbhr\") pod \"metallb-operator-webhook-server-7d5b654677-cwqcf\" (UID: \"1d9a7806-fa8d-4106-9241-a32afafc5eb7\") " pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.394654 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.787430 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw"] Mar 21 05:07:06 crc kubenswrapper[4580]: W0321 05:07:06.795342 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ee4637_a40f_4200_be5a_679e0912f4cf.slice/crio-e4f6bb3bc0e99fca87ef11ae61a1739280109f0dbe7d8b48b37cf31817ebc126 WatchSource:0}: Error finding container e4f6bb3bc0e99fca87ef11ae61a1739280109f0dbe7d8b48b37cf31817ebc126: Status 404 returned error can't find the container with id e4f6bb3bc0e99fca87ef11ae61a1739280109f0dbe7d8b48b37cf31817ebc126 Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.800734 4580 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:07:06 crc kubenswrapper[4580]: I0321 05:07:06.861675 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf"] Mar 21 05:07:07 crc kubenswrapper[4580]: I0321 05:07:07.047885 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" event={"ID":"22ee4637-a40f-4200-be5a-679e0912f4cf","Type":"ContainerStarted","Data":"e4f6bb3bc0e99fca87ef11ae61a1739280109f0dbe7d8b48b37cf31817ebc126"} Mar 21 05:07:07 crc kubenswrapper[4580]: I0321 05:07:07.049484 4580 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" event={"ID":"1d9a7806-fa8d-4106-9241-a32afafc5eb7","Type":"ContainerStarted","Data":"fe5a6cabd991cbd0f057d9cdd839303c77dd283e1e458e1eae333fe17651a49b"} Mar 21 05:07:14 crc kubenswrapper[4580]: I0321 05:07:14.112679 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" event={"ID":"22ee4637-a40f-4200-be5a-679e0912f4cf","Type":"ContainerStarted","Data":"824ff8f1c3ca991b7f3e7c6410fc7ea1d76ac9eddf96cdbc6f7bff664115becc"} Mar 21 05:07:14 crc kubenswrapper[4580]: I0321 05:07:14.113455 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" Mar 21 05:07:14 crc kubenswrapper[4580]: I0321 05:07:14.115226 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" event={"ID":"1d9a7806-fa8d-4106-9241-a32afafc5eb7","Type":"ContainerStarted","Data":"b9fa3baf09bf866bb991902fc3edb5222d1726408f1f2bb822f4a1f67592eed8"} Mar 21 05:07:14 crc kubenswrapper[4580]: I0321 05:07:14.115389 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" Mar 21 05:07:14 crc kubenswrapper[4580]: I0321 05:07:14.145416 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" podStartSLOduration=2.069505002 podStartE2EDuration="9.145386762s" podCreationTimestamp="2026-03-21 05:07:05 +0000 UTC" firstStartedPulling="2026-03-21 05:07:06.800382944 +0000 UTC m=+931.882966572" lastFinishedPulling="2026-03-21 05:07:13.876264704 +0000 UTC m=+938.958848332" observedRunningTime="2026-03-21 05:07:14.136687012 +0000 UTC m=+939.219270670" watchObservedRunningTime="2026-03-21 05:07:14.145386762 +0000 UTC m=+939.227970390" Mar 21 
05:07:14 crc kubenswrapper[4580]: I0321 05:07:14.171385 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" podStartSLOduration=2.149259151 podStartE2EDuration="9.171358009s" podCreationTimestamp="2026-03-21 05:07:05 +0000 UTC" firstStartedPulling="2026-03-21 05:07:06.87094443 +0000 UTC m=+931.953528048" lastFinishedPulling="2026-03-21 05:07:13.893043278 +0000 UTC m=+938.975626906" observedRunningTime="2026-03-21 05:07:14.167836696 +0000 UTC m=+939.250420334" watchObservedRunningTime="2026-03-21 05:07:14.171358009 +0000 UTC m=+939.253941637" Mar 21 05:07:15 crc kubenswrapper[4580]: I0321 05:07:15.947398 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:07:15 crc kubenswrapper[4580]: I0321 05:07:15.947459 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:07:26 crc kubenswrapper[4580]: I0321 05:07:26.404070 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7d5b654677-cwqcf" Mar 21 05:07:45 crc kubenswrapper[4580]: I0321 05:07:45.948084 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:07:45 crc kubenswrapper[4580]: I0321 
05:07:45.948996 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:07:45 crc kubenswrapper[4580]: I0321 05:07:45.949065 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 05:07:45 crc kubenswrapper[4580]: I0321 05:07:45.950074 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"76a03eb87bee439fb7189493fe11b7778fb36a6c538f9c47967069b07415ab8b"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:07:45 crc kubenswrapper[4580]: I0321 05:07:45.950155 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://76a03eb87bee439fb7189493fe11b7778fb36a6c538f9c47967069b07415ab8b" gracePeriod=600 Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.002613 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-cf889bd6-5qrxw" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.659851 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="76a03eb87bee439fb7189493fe11b7778fb36a6c538f9c47967069b07415ab8b" exitCode=0 Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.659910 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"76a03eb87bee439fb7189493fe11b7778fb36a6c538f9c47967069b07415ab8b"} Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.660471 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"3ce83f011c377b22dc6fc9c4fe068d2bf2cb580d09b97baaf4fd92fe417cd5eb"} Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.660499 4580 scope.go:117] "RemoveContainer" containerID="c0eb8b838e32b2cb84a7017035087c480a43ddf5f25d7a157720d87f4ca4f069" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.756714 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl"] Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.757534 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.765571 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.765917 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-f25bm" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.777902 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-bzwqp"] Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.780366 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.786294 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl"] Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.797085 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.797398 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.836368 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e974ab22-96b8-4617-9fcf-db94114f0b0d-frr-sockets\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.836417 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x2v7\" (UniqueName: \"kubernetes.io/projected/e974ab22-96b8-4617-9fcf-db94114f0b0d-kube-api-access-5x2v7\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.836440 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e974ab22-96b8-4617-9fcf-db94114f0b0d-reloader\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.836474 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e974ab22-96b8-4617-9fcf-db94114f0b0d-frr-startup\") pod 
\"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.836499 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e974ab22-96b8-4617-9fcf-db94114f0b0d-metrics\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.836542 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e974ab22-96b8-4617-9fcf-db94114f0b0d-frr-conf\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.836565 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w5mn\" (UniqueName: \"kubernetes.io/projected/254226c1-a87d-4bca-a3d4-a909452fa9ac-kube-api-access-2w5mn\") pod \"frr-k8s-webhook-server-bcc4b6f68-qdhgl\" (UID: \"254226c1-a87d-4bca-a3d4-a909452fa9ac\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.836597 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/254226c1-a87d-4bca-a3d4-a909452fa9ac-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qdhgl\" (UID: \"254226c1-a87d-4bca-a3d4-a909452fa9ac\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.836614 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e974ab22-96b8-4617-9fcf-db94114f0b0d-metrics-certs\") pod \"frr-k8s-bzwqp\" (UID: 
\"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.895369 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-smqrq"] Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.896470 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-smqrq" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.900463 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.900488 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-698vm" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.900680 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.900749 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.921642 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-vlfhg"] Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.922995 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.927647 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.938265 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e974ab22-96b8-4617-9fcf-db94114f0b0d-frr-startup\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.938331 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e974ab22-96b8-4617-9fcf-db94114f0b0d-metrics\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.938386 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e974ab22-96b8-4617-9fcf-db94114f0b0d-frr-conf\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.938419 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w5mn\" (UniqueName: \"kubernetes.io/projected/254226c1-a87d-4bca-a3d4-a909452fa9ac-kube-api-access-2w5mn\") pod \"frr-k8s-webhook-server-bcc4b6f68-qdhgl\" (UID: \"254226c1-a87d-4bca-a3d4-a909452fa9ac\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.938484 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/254226c1-a87d-4bca-a3d4-a909452fa9ac-cert\") pod 
\"frr-k8s-webhook-server-bcc4b6f68-qdhgl\" (UID: \"254226c1-a87d-4bca-a3d4-a909452fa9ac\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.938511 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e974ab22-96b8-4617-9fcf-db94114f0b0d-metrics-certs\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.938566 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x2v7\" (UniqueName: \"kubernetes.io/projected/e974ab22-96b8-4617-9fcf-db94114f0b0d-kube-api-access-5x2v7\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.938588 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e974ab22-96b8-4617-9fcf-db94114f0b0d-frr-sockets\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.938609 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e974ab22-96b8-4617-9fcf-db94114f0b0d-reloader\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.939110 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e974ab22-96b8-4617-9fcf-db94114f0b0d-reloader\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 
05:07:46.940061 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e974ab22-96b8-4617-9fcf-db94114f0b0d-frr-startup\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.940303 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e974ab22-96b8-4617-9fcf-db94114f0b0d-metrics\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.940532 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e974ab22-96b8-4617-9fcf-db94114f0b0d-frr-conf\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: E0321 05:07:46.941052 4580 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 21 05:07:46 crc kubenswrapper[4580]: E0321 05:07:46.941124 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/254226c1-a87d-4bca-a3d4-a909452fa9ac-cert podName:254226c1-a87d-4bca-a3d4-a909452fa9ac nodeName:}" failed. No retries permitted until 2026-03-21 05:07:47.44110068 +0000 UTC m=+972.523684318 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/254226c1-a87d-4bca-a3d4-a909452fa9ac-cert") pod "frr-k8s-webhook-server-bcc4b6f68-qdhgl" (UID: "254226c1-a87d-4bca-a3d4-a909452fa9ac") : secret "frr-k8s-webhook-server-cert" not found Mar 21 05:07:46 crc kubenswrapper[4580]: E0321 05:07:46.941370 4580 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 21 05:07:46 crc kubenswrapper[4580]: E0321 05:07:46.941409 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e974ab22-96b8-4617-9fcf-db94114f0b0d-metrics-certs podName:e974ab22-96b8-4617-9fcf-db94114f0b0d nodeName:}" failed. No retries permitted until 2026-03-21 05:07:47.441399688 +0000 UTC m=+972.523983316 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e974ab22-96b8-4617-9fcf-db94114f0b0d-metrics-certs") pod "frr-k8s-bzwqp" (UID: "e974ab22-96b8-4617-9fcf-db94114f0b0d") : secret "frr-k8s-certs-secret" not found Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.941766 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e974ab22-96b8-4617-9fcf-db94114f0b0d-frr-sockets\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:46 crc kubenswrapper[4580]: I0321 05:07:46.951866 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-vlfhg"] Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:46.994626 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x2v7\" (UniqueName: \"kubernetes.io/projected/e974ab22-96b8-4617-9fcf-db94114f0b0d-kube-api-access-5x2v7\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:47 crc 
kubenswrapper[4580]: I0321 05:07:47.003551 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w5mn\" (UniqueName: \"kubernetes.io/projected/254226c1-a87d-4bca-a3d4-a909452fa9ac-kube-api-access-2w5mn\") pod \"frr-k8s-webhook-server-bcc4b6f68-qdhgl\" (UID: \"254226c1-a87d-4bca-a3d4-a909452fa9ac\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.040216 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab7c431f-e254-4c36-a240-15ec5cbb14e9-cert\") pod \"controller-7bb4cc7c98-vlfhg\" (UID: \"ab7c431f-e254-4c36-a240-15ec5cbb14e9\") " pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.040308 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d2b75e08-f7c9-47c8-9b08-f574bb92461d-metallb-excludel2\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.040365 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvv26\" (UniqueName: \"kubernetes.io/projected/d2b75e08-f7c9-47c8-9b08-f574bb92461d-kube-api-access-fvv26\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.040425 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab7c431f-e254-4c36-a240-15ec5cbb14e9-metrics-certs\") pod \"controller-7bb4cc7c98-vlfhg\" (UID: \"ab7c431f-e254-4c36-a240-15ec5cbb14e9\") " pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:07:47 crc 
kubenswrapper[4580]: I0321 05:07:47.040459 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-metrics-certs\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.040504 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2gws\" (UniqueName: \"kubernetes.io/projected/ab7c431f-e254-4c36-a240-15ec5cbb14e9-kube-api-access-b2gws\") pod \"controller-7bb4cc7c98-vlfhg\" (UID: \"ab7c431f-e254-4c36-a240-15ec5cbb14e9\") " pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.040541 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-memberlist\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.141627 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d2b75e08-f7c9-47c8-9b08-f574bb92461d-metallb-excludel2\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.141730 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvv26\" (UniqueName: \"kubernetes.io/projected/d2b75e08-f7c9-47c8-9b08-f574bb92461d-kube-api-access-fvv26\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.142159 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab7c431f-e254-4c36-a240-15ec5cbb14e9-metrics-certs\") pod \"controller-7bb4cc7c98-vlfhg\" (UID: \"ab7c431f-e254-4c36-a240-15ec5cbb14e9\") " pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:07:47 crc kubenswrapper[4580]: E0321 05:07:47.142259 4580 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 21 05:07:47 crc kubenswrapper[4580]: E0321 05:07:47.142317 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab7c431f-e254-4c36-a240-15ec5cbb14e9-metrics-certs podName:ab7c431f-e254-4c36-a240-15ec5cbb14e9 nodeName:}" failed. No retries permitted until 2026-03-21 05:07:47.64229405 +0000 UTC m=+972.724877678 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab7c431f-e254-4c36-a240-15ec5cbb14e9-metrics-certs") pod "controller-7bb4cc7c98-vlfhg" (UID: "ab7c431f-e254-4c36-a240-15ec5cbb14e9") : secret "controller-certs-secret" not found Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.142498 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-metrics-certs\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:47 crc kubenswrapper[4580]: E0321 05:07:47.142597 4580 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 21 05:07:47 crc kubenswrapper[4580]: E0321 05:07:47.142629 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-metrics-certs podName:d2b75e08-f7c9-47c8-9b08-f574bb92461d nodeName:}" failed. 
No retries permitted until 2026-03-21 05:07:47.642619539 +0000 UTC m=+972.725203167 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-metrics-certs") pod "speaker-smqrq" (UID: "d2b75e08-f7c9-47c8-9b08-f574bb92461d") : secret "speaker-certs-secret" not found Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.142637 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d2b75e08-f7c9-47c8-9b08-f574bb92461d-metallb-excludel2\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.142647 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gws\" (UniqueName: \"kubernetes.io/projected/ab7c431f-e254-4c36-a240-15ec5cbb14e9-kube-api-access-b2gws\") pod \"controller-7bb4cc7c98-vlfhg\" (UID: \"ab7c431f-e254-4c36-a240-15ec5cbb14e9\") " pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.142771 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-memberlist\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:47 crc kubenswrapper[4580]: E0321 05:07:47.142841 4580 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 21 05:07:47 crc kubenswrapper[4580]: E0321 05:07:47.142869 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-memberlist podName:d2b75e08-f7c9-47c8-9b08-f574bb92461d nodeName:}" failed. 
No retries permitted until 2026-03-21 05:07:47.642862685 +0000 UTC m=+972.725446303 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-memberlist") pod "speaker-smqrq" (UID: "d2b75e08-f7c9-47c8-9b08-f574bb92461d") : secret "metallb-memberlist" not found Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.142908 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab7c431f-e254-4c36-a240-15ec5cbb14e9-cert\") pod \"controller-7bb4cc7c98-vlfhg\" (UID: \"ab7c431f-e254-4c36-a240-15ec5cbb14e9\") " pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.144886 4580 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.157560 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab7c431f-e254-4c36-a240-15ec5cbb14e9-cert\") pod \"controller-7bb4cc7c98-vlfhg\" (UID: \"ab7c431f-e254-4c36-a240-15ec5cbb14e9\") " pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.169153 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvv26\" (UniqueName: \"kubernetes.io/projected/d2b75e08-f7c9-47c8-9b08-f574bb92461d-kube-api-access-fvv26\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.169690 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2gws\" (UniqueName: \"kubernetes.io/projected/ab7c431f-e254-4c36-a240-15ec5cbb14e9-kube-api-access-b2gws\") pod \"controller-7bb4cc7c98-vlfhg\" (UID: \"ab7c431f-e254-4c36-a240-15ec5cbb14e9\") " 
pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.448110 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/254226c1-a87d-4bca-a3d4-a909452fa9ac-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qdhgl\" (UID: \"254226c1-a87d-4bca-a3d4-a909452fa9ac\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.448165 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e974ab22-96b8-4617-9fcf-db94114f0b0d-metrics-certs\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.451879 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e974ab22-96b8-4617-9fcf-db94114f0b0d-metrics-certs\") pod \"frr-k8s-bzwqp\" (UID: \"e974ab22-96b8-4617-9fcf-db94114f0b0d\") " pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.456488 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/254226c1-a87d-4bca-a3d4-a909452fa9ac-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qdhgl\" (UID: \"254226c1-a87d-4bca-a3d4-a909452fa9ac\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.650802 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-memberlist\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.650959 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab7c431f-e254-4c36-a240-15ec5cbb14e9-metrics-certs\") pod \"controller-7bb4cc7c98-vlfhg\" (UID: \"ab7c431f-e254-4c36-a240-15ec5cbb14e9\") " pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:07:47 crc kubenswrapper[4580]: E0321 05:07:47.650966 4580 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.651001 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-metrics-certs\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:47 crc kubenswrapper[4580]: E0321 05:07:47.651053 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-memberlist podName:d2b75e08-f7c9-47c8-9b08-f574bb92461d nodeName:}" failed. No retries permitted until 2026-03-21 05:07:48.651025704 +0000 UTC m=+973.733609332 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-memberlist") pod "speaker-smqrq" (UID: "d2b75e08-f7c9-47c8-9b08-f574bb92461d") : secret "metallb-memberlist" not found Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.656461 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab7c431f-e254-4c36-a240-15ec5cbb14e9-metrics-certs\") pod \"controller-7bb4cc7c98-vlfhg\" (UID: \"ab7c431f-e254-4c36-a240-15ec5cbb14e9\") " pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.667046 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-metrics-certs\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.684557 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.696266 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.842971 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:07:47 crc kubenswrapper[4580]: I0321 05:07:47.949058 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl"] Mar 21 05:07:47 crc kubenswrapper[4580]: W0321 05:07:47.955559 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod254226c1_a87d_4bca_a3d4_a909452fa9ac.slice/crio-fe53f62cd83280b7d86a74ffdfb6a9a5dd57515eff3ceae545bde9497ceb58d7 WatchSource:0}: Error finding container fe53f62cd83280b7d86a74ffdfb6a9a5dd57515eff3ceae545bde9497ceb58d7: Status 404 returned error can't find the container with id fe53f62cd83280b7d86a74ffdfb6a9a5dd57515eff3ceae545bde9497ceb58d7 Mar 21 05:07:48 crc kubenswrapper[4580]: I0321 05:07:48.373361 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-vlfhg"] Mar 21 05:07:48 crc kubenswrapper[4580]: W0321 05:07:48.391007 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab7c431f_e254_4c36_a240_15ec5cbb14e9.slice/crio-37581d6cab591cbb64f5e649058e0940b52c0b263f1be4b74bef7a9a0586e764 WatchSource:0}: Error finding container 37581d6cab591cbb64f5e649058e0940b52c0b263f1be4b74bef7a9a0586e764: Status 404 returned error can't find the container with id 37581d6cab591cbb64f5e649058e0940b52c0b263f1be4b74bef7a9a0586e764 Mar 21 05:07:48 crc kubenswrapper[4580]: I0321 05:07:48.667480 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-memberlist\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:48 crc kubenswrapper[4580]: I0321 05:07:48.677024 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"memberlist\" (UniqueName: \"kubernetes.io/secret/d2b75e08-f7c9-47c8-9b08-f574bb92461d-memberlist\") pod \"speaker-smqrq\" (UID: \"d2b75e08-f7c9-47c8-9b08-f574bb92461d\") " pod="metallb-system/speaker-smqrq" Mar 21 05:07:48 crc kubenswrapper[4580]: I0321 05:07:48.679999 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" event={"ID":"254226c1-a87d-4bca-a3d4-a909452fa9ac","Type":"ContainerStarted","Data":"fe53f62cd83280b7d86a74ffdfb6a9a5dd57515eff3ceae545bde9497ceb58d7"} Mar 21 05:07:48 crc kubenswrapper[4580]: I0321 05:07:48.681266 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bzwqp" event={"ID":"e974ab22-96b8-4617-9fcf-db94114f0b0d","Type":"ContainerStarted","Data":"0ba8e4bd55c9595ecebf5a29666c67239df52e775340c06e4bbdcb810a0b7ae4"} Mar 21 05:07:48 crc kubenswrapper[4580]: I0321 05:07:48.682642 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-vlfhg" event={"ID":"ab7c431f-e254-4c36-a240-15ec5cbb14e9","Type":"ContainerStarted","Data":"4f86eeac19e29687c9a469087fb31006fc05e46fc78a7275ed18345225f22922"} Mar 21 05:07:48 crc kubenswrapper[4580]: I0321 05:07:48.682676 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-vlfhg" event={"ID":"ab7c431f-e254-4c36-a240-15ec5cbb14e9","Type":"ContainerStarted","Data":"37581d6cab591cbb64f5e649058e0940b52c0b263f1be4b74bef7a9a0586e764"} Mar 21 05:07:48 crc kubenswrapper[4580]: I0321 05:07:48.715461 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-smqrq" Mar 21 05:07:49 crc kubenswrapper[4580]: I0321 05:07:49.695891 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-vlfhg" event={"ID":"ab7c431f-e254-4c36-a240-15ec5cbb14e9","Type":"ContainerStarted","Data":"5039d77035de3ea8232f61307eb16cbb10d3bb4da2a434f2ac6a8c80223464eb"} Mar 21 05:07:49 crc kubenswrapper[4580]: I0321 05:07:49.696337 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:07:49 crc kubenswrapper[4580]: I0321 05:07:49.700641 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-smqrq" event={"ID":"d2b75e08-f7c9-47c8-9b08-f574bb92461d","Type":"ContainerStarted","Data":"dd06fbfbab9864d41ba42493216792e492b8582eaf639e286a7aef3c3e7ae52f"} Mar 21 05:07:49 crc kubenswrapper[4580]: I0321 05:07:49.700684 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-smqrq" event={"ID":"d2b75e08-f7c9-47c8-9b08-f574bb92461d","Type":"ContainerStarted","Data":"13d4eb36cb9ac6fe7b2c0f9a0bf86ccadce29c155979a044b20fae85e12fed0d"} Mar 21 05:07:49 crc kubenswrapper[4580]: I0321 05:07:49.700695 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-smqrq" event={"ID":"d2b75e08-f7c9-47c8-9b08-f574bb92461d","Type":"ContainerStarted","Data":"f6334b0401c5353ce79e7258e488c27d7f6a5ee11931bc63991e43d1b28a97ca"} Mar 21 05:07:49 crc kubenswrapper[4580]: I0321 05:07:49.701016 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-smqrq" Mar 21 05:07:49 crc kubenswrapper[4580]: I0321 05:07:49.726481 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-vlfhg" podStartSLOduration=3.726454972 podStartE2EDuration="3.726454972s" podCreationTimestamp="2026-03-21 05:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:49.719462757 +0000 UTC m=+974.802046405" watchObservedRunningTime="2026-03-21 05:07:49.726454972 +0000 UTC m=+974.809038590" Mar 21 05:07:55 crc kubenswrapper[4580]: I0321 05:07:55.649950 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-smqrq" podStartSLOduration=9.649919205 podStartE2EDuration="9.649919205s" podCreationTimestamp="2026-03-21 05:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:49.759823674 +0000 UTC m=+974.842407322" watchObservedRunningTime="2026-03-21 05:07:55.649919205 +0000 UTC m=+980.732502833" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.596386 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dm88x"] Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.599361 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.619965 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dm88x"] Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.626230 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flwms\" (UniqueName: \"kubernetes.io/projected/30341025-0539-42a5-937b-58ba2187d3ed-kube-api-access-flwms\") pod \"certified-operators-dm88x\" (UID: \"30341025-0539-42a5-937b-58ba2187d3ed\") " pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.626337 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30341025-0539-42a5-937b-58ba2187d3ed-catalog-content\") pod \"certified-operators-dm88x\" (UID: \"30341025-0539-42a5-937b-58ba2187d3ed\") " pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.626392 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30341025-0539-42a5-937b-58ba2187d3ed-utilities\") pod \"certified-operators-dm88x\" (UID: \"30341025-0539-42a5-937b-58ba2187d3ed\") " pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.721603 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-smqrq" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.727367 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30341025-0539-42a5-937b-58ba2187d3ed-catalog-content\") pod \"certified-operators-dm88x\" (UID: 
\"30341025-0539-42a5-937b-58ba2187d3ed\") " pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.727426 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30341025-0539-42a5-937b-58ba2187d3ed-utilities\") pod \"certified-operators-dm88x\" (UID: \"30341025-0539-42a5-937b-58ba2187d3ed\") " pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.727475 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flwms\" (UniqueName: \"kubernetes.io/projected/30341025-0539-42a5-937b-58ba2187d3ed-kube-api-access-flwms\") pod \"certified-operators-dm88x\" (UID: \"30341025-0539-42a5-937b-58ba2187d3ed\") " pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.728266 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30341025-0539-42a5-937b-58ba2187d3ed-catalog-content\") pod \"certified-operators-dm88x\" (UID: \"30341025-0539-42a5-937b-58ba2187d3ed\") " pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.728483 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30341025-0539-42a5-937b-58ba2187d3ed-utilities\") pod \"certified-operators-dm88x\" (UID: \"30341025-0539-42a5-937b-58ba2187d3ed\") " pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.769713 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flwms\" (UniqueName: \"kubernetes.io/projected/30341025-0539-42a5-937b-58ba2187d3ed-kube-api-access-flwms\") pod \"certified-operators-dm88x\" (UID: 
\"30341025-0539-42a5-937b-58ba2187d3ed\") " pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.803695 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" event={"ID":"254226c1-a87d-4bca-a3d4-a909452fa9ac","Type":"ContainerStarted","Data":"41e6c34caa048af9bdebe2d121e9395771c3beb5b9085feeb0455ace23d1047d"} Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.803991 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.805240 4580 generic.go:334] "Generic (PLEG): container finished" podID="e974ab22-96b8-4617-9fcf-db94114f0b0d" containerID="92f0cd48667339e055a2fb876b9589446f1fa219e1b5010fc330ceada3fbdc47" exitCode=0 Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.805271 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bzwqp" event={"ID":"e974ab22-96b8-4617-9fcf-db94114f0b0d","Type":"ContainerDied","Data":"92f0cd48667339e055a2fb876b9589446f1fa219e1b5010fc330ceada3fbdc47"} Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.828004 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" podStartSLOduration=2.904457323 podStartE2EDuration="12.827983492s" podCreationTimestamp="2026-03-21 05:07:46 +0000 UTC" firstStartedPulling="2026-03-21 05:07:47.95905146 +0000 UTC m=+973.041635078" lastFinishedPulling="2026-03-21 05:07:57.882577609 +0000 UTC m=+982.965161247" observedRunningTime="2026-03-21 05:07:58.823256097 +0000 UTC m=+983.905839745" watchObservedRunningTime="2026-03-21 05:07:58.827983492 +0000 UTC m=+983.910567120" Mar 21 05:07:58 crc kubenswrapper[4580]: I0321 05:07:58.918979 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:07:59 crc kubenswrapper[4580]: I0321 05:07:59.413751 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dm88x"] Mar 21 05:07:59 crc kubenswrapper[4580]: I0321 05:07:59.812761 4580 generic.go:334] "Generic (PLEG): container finished" podID="e974ab22-96b8-4617-9fcf-db94114f0b0d" containerID="35214ea617372a3ef4d35327129daadb90fff6333d5d08de8db2b0979540df12" exitCode=0 Mar 21 05:07:59 crc kubenswrapper[4580]: I0321 05:07:59.812810 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bzwqp" event={"ID":"e974ab22-96b8-4617-9fcf-db94114f0b0d","Type":"ContainerDied","Data":"35214ea617372a3ef4d35327129daadb90fff6333d5d08de8db2b0979540df12"} Mar 21 05:07:59 crc kubenswrapper[4580]: I0321 05:07:59.816662 4580 generic.go:334] "Generic (PLEG): container finished" podID="30341025-0539-42a5-937b-58ba2187d3ed" containerID="83fa8ee72bf09090f0fe8b26e76916d8852cb40e91e683de771c8c2e2c9d8e7b" exitCode=0 Mar 21 05:07:59 crc kubenswrapper[4580]: I0321 05:07:59.816763 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm88x" event={"ID":"30341025-0539-42a5-937b-58ba2187d3ed","Type":"ContainerDied","Data":"83fa8ee72bf09090f0fe8b26e76916d8852cb40e91e683de771c8c2e2c9d8e7b"} Mar 21 05:07:59 crc kubenswrapper[4580]: I0321 05:07:59.816869 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm88x" event={"ID":"30341025-0539-42a5-937b-58ba2187d3ed","Type":"ContainerStarted","Data":"c9f3f0796135f7ad131bfa8cae0c494cec3958c17459ce88020eb4acd67a91c9"} Mar 21 05:08:00 crc kubenswrapper[4580]: I0321 05:08:00.128297 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567828-rqw6s"] Mar 21 05:08:00 crc kubenswrapper[4580]: I0321 05:08:00.129743 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-rqw6s" Mar 21 05:08:00 crc kubenswrapper[4580]: I0321 05:08:00.132465 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:08:00 crc kubenswrapper[4580]: I0321 05:08:00.132887 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:08:00 crc kubenswrapper[4580]: I0321 05:08:00.136185 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:08:00 crc kubenswrapper[4580]: I0321 05:08:00.166164 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-rqw6s"] Mar 21 05:08:00 crc kubenswrapper[4580]: I0321 05:08:00.259230 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v67m\" (UniqueName: \"kubernetes.io/projected/8604f80a-d308-4e06-8d72-3281dfdc4a6a-kube-api-access-6v67m\") pod \"auto-csr-approver-29567828-rqw6s\" (UID: \"8604f80a-d308-4e06-8d72-3281dfdc4a6a\") " pod="openshift-infra/auto-csr-approver-29567828-rqw6s" Mar 21 05:08:00 crc kubenswrapper[4580]: I0321 05:08:00.360711 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v67m\" (UniqueName: \"kubernetes.io/projected/8604f80a-d308-4e06-8d72-3281dfdc4a6a-kube-api-access-6v67m\") pod \"auto-csr-approver-29567828-rqw6s\" (UID: \"8604f80a-d308-4e06-8d72-3281dfdc4a6a\") " pod="openshift-infra/auto-csr-approver-29567828-rqw6s" Mar 21 05:08:00 crc kubenswrapper[4580]: I0321 05:08:00.385067 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v67m\" (UniqueName: \"kubernetes.io/projected/8604f80a-d308-4e06-8d72-3281dfdc4a6a-kube-api-access-6v67m\") pod \"auto-csr-approver-29567828-rqw6s\" (UID: \"8604f80a-d308-4e06-8d72-3281dfdc4a6a\") " 
pod="openshift-infra/auto-csr-approver-29567828-rqw6s" Mar 21 05:08:00 crc kubenswrapper[4580]: I0321 05:08:00.475181 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-rqw6s" Mar 21 05:08:00 crc kubenswrapper[4580]: I0321 05:08:00.826945 4580 generic.go:334] "Generic (PLEG): container finished" podID="e974ab22-96b8-4617-9fcf-db94114f0b0d" containerID="f278a94320d508fc2da656c78ac7a13755ff5b11828fa8fd33cc40a18be7a0f5" exitCode=0 Mar 21 05:08:00 crc kubenswrapper[4580]: I0321 05:08:00.827015 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bzwqp" event={"ID":"e974ab22-96b8-4617-9fcf-db94114f0b0d","Type":"ContainerDied","Data":"f278a94320d508fc2da656c78ac7a13755ff5b11828fa8fd33cc40a18be7a0f5"} Mar 21 05:08:00 crc kubenswrapper[4580]: I0321 05:08:00.901866 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-rqw6s"] Mar 21 05:08:00 crc kubenswrapper[4580]: W0321 05:08:00.905633 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8604f80a_d308_4e06_8d72_3281dfdc4a6a.slice/crio-2457f6edd35905b744bd62d20ae1d85886b61de283edf9c5bb066140b80c870a WatchSource:0}: Error finding container 2457f6edd35905b744bd62d20ae1d85886b61de283edf9c5bb066140b80c870a: Status 404 returned error can't find the container with id 2457f6edd35905b744bd62d20ae1d85886b61de283edf9c5bb066140b80c870a Mar 21 05:08:01 crc kubenswrapper[4580]: I0321 05:08:01.840200 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bzwqp" event={"ID":"e974ab22-96b8-4617-9fcf-db94114f0b0d","Type":"ContainerStarted","Data":"7eabd98c643454328fcaf1787a4f0034ab29ea5296b916ce0f68e294f86e90a2"} Mar 21 05:08:01 crc kubenswrapper[4580]: I0321 05:08:01.840545 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bzwqp" 
event={"ID":"e974ab22-96b8-4617-9fcf-db94114f0b0d","Type":"ContainerStarted","Data":"2c628b44f4184801d370ea66c8e8c3b506a6d894697a81e5c91134b37274fc97"} Mar 21 05:08:01 crc kubenswrapper[4580]: I0321 05:08:01.840563 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bzwqp" event={"ID":"e974ab22-96b8-4617-9fcf-db94114f0b0d","Type":"ContainerStarted","Data":"7a0b949000c400bc39ed85b14b1c0ad33f37408dc33c622c66b1d83586f72c86"} Mar 21 05:08:01 crc kubenswrapper[4580]: I0321 05:08:01.840595 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bzwqp" event={"ID":"e974ab22-96b8-4617-9fcf-db94114f0b0d","Type":"ContainerStarted","Data":"2c469a9750932ddf3e182b35492f3b2cff24204a3b5b0aff2d72b7a4c3164767"} Mar 21 05:08:01 crc kubenswrapper[4580]: I0321 05:08:01.841940 4580 generic.go:334] "Generic (PLEG): container finished" podID="30341025-0539-42a5-937b-58ba2187d3ed" containerID="ea7e7b483ca55b363711497425ce34f8d6330f62c485d4ae08acbbcad56cb417" exitCode=0 Mar 21 05:08:01 crc kubenswrapper[4580]: I0321 05:08:01.842036 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm88x" event={"ID":"30341025-0539-42a5-937b-58ba2187d3ed","Type":"ContainerDied","Data":"ea7e7b483ca55b363711497425ce34f8d6330f62c485d4ae08acbbcad56cb417"} Mar 21 05:08:01 crc kubenswrapper[4580]: I0321 05:08:01.843056 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-rqw6s" event={"ID":"8604f80a-d308-4e06-8d72-3281dfdc4a6a","Type":"ContainerStarted","Data":"2457f6edd35905b744bd62d20ae1d85886b61de283edf9c5bb066140b80c870a"} Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.589868 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xl9tp"] Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.591676 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.596841 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xl9tp"] Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.702831 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3934521-9a59-47c3-baca-2aefdc32a353-catalog-content\") pod \"community-operators-xl9tp\" (UID: \"e3934521-9a59-47c3-baca-2aefdc32a353\") " pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.702911 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8jdz\" (UniqueName: \"kubernetes.io/projected/e3934521-9a59-47c3-baca-2aefdc32a353-kube-api-access-q8jdz\") pod \"community-operators-xl9tp\" (UID: \"e3934521-9a59-47c3-baca-2aefdc32a353\") " pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.703029 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3934521-9a59-47c3-baca-2aefdc32a353-utilities\") pod \"community-operators-xl9tp\" (UID: \"e3934521-9a59-47c3-baca-2aefdc32a353\") " pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.804034 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3934521-9a59-47c3-baca-2aefdc32a353-utilities\") pod \"community-operators-xl9tp\" (UID: \"e3934521-9a59-47c3-baca-2aefdc32a353\") " pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.804110 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3934521-9a59-47c3-baca-2aefdc32a353-catalog-content\") pod \"community-operators-xl9tp\" (UID: \"e3934521-9a59-47c3-baca-2aefdc32a353\") " pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.804173 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8jdz\" (UniqueName: \"kubernetes.io/projected/e3934521-9a59-47c3-baca-2aefdc32a353-kube-api-access-q8jdz\") pod \"community-operators-xl9tp\" (UID: \"e3934521-9a59-47c3-baca-2aefdc32a353\") " pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.805175 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3934521-9a59-47c3-baca-2aefdc32a353-utilities\") pod \"community-operators-xl9tp\" (UID: \"e3934521-9a59-47c3-baca-2aefdc32a353\") " pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.805411 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3934521-9a59-47c3-baca-2aefdc32a353-catalog-content\") pod \"community-operators-xl9tp\" (UID: \"e3934521-9a59-47c3-baca-2aefdc32a353\") " pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.837474 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8jdz\" (UniqueName: \"kubernetes.io/projected/e3934521-9a59-47c3-baca-2aefdc32a353-kube-api-access-q8jdz\") pod \"community-operators-xl9tp\" (UID: \"e3934521-9a59-47c3-baca-2aefdc32a353\") " pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.861743 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-bzwqp" event={"ID":"e974ab22-96b8-4617-9fcf-db94114f0b0d","Type":"ContainerStarted","Data":"e8bc6019be48d7c02b7991071df046e49b498a4e25b7a005d4f622e9ff568793"} Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.862028 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bzwqp" event={"ID":"e974ab22-96b8-4617-9fcf-db94114f0b0d","Type":"ContainerStarted","Data":"6cf1078553f643d7df8265e4c78dddd976487886c5b4de9a87030d9ab3a13649"} Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.862213 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.901325 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-bzwqp" podStartSLOduration=6.911873094 podStartE2EDuration="16.901304456s" podCreationTimestamp="2026-03-21 05:07:46 +0000 UTC" firstStartedPulling="2026-03-21 05:07:47.872333747 +0000 UTC m=+972.954917375" lastFinishedPulling="2026-03-21 05:07:57.861765109 +0000 UTC m=+982.944348737" observedRunningTime="2026-03-21 05:08:02.895284837 +0000 UTC m=+987.977868475" watchObservedRunningTime="2026-03-21 05:08:02.901304456 +0000 UTC m=+987.983888094" Mar 21 05:08:02 crc kubenswrapper[4580]: I0321 05:08:02.914710 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:03 crc kubenswrapper[4580]: I0321 05:08:03.227383 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xl9tp"] Mar 21 05:08:03 crc kubenswrapper[4580]: I0321 05:08:03.876503 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-rqw6s" event={"ID":"8604f80a-d308-4e06-8d72-3281dfdc4a6a","Type":"ContainerStarted","Data":"71f777cfaf78023adc21f19c8de59042d08f29ef677ffe40192603ca913efe27"} Mar 21 05:08:03 crc kubenswrapper[4580]: I0321 05:08:03.878377 4580 generic.go:334] "Generic (PLEG): container finished" podID="e3934521-9a59-47c3-baca-2aefdc32a353" containerID="d0fa01d6d003281d95405b496129af20dc8d758e65d50ed2fb36b95cb4a6fe87" exitCode=0 Mar 21 05:08:03 crc kubenswrapper[4580]: I0321 05:08:03.879908 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xl9tp" event={"ID":"e3934521-9a59-47c3-baca-2aefdc32a353","Type":"ContainerDied","Data":"d0fa01d6d003281d95405b496129af20dc8d758e65d50ed2fb36b95cb4a6fe87"} Mar 21 05:08:03 crc kubenswrapper[4580]: I0321 05:08:03.879966 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xl9tp" event={"ID":"e3934521-9a59-47c3-baca-2aefdc32a353","Type":"ContainerStarted","Data":"972ca05b544aaa8d7d65a59278e5202de5e01012fb5f02416cb6195a9c38e133"} Mar 21 05:08:03 crc kubenswrapper[4580]: I0321 05:08:03.924281 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567828-rqw6s" podStartSLOduration=2.267421112 podStartE2EDuration="3.924254029s" podCreationTimestamp="2026-03-21 05:08:00 +0000 UTC" firstStartedPulling="2026-03-21 05:08:00.917919353 +0000 UTC m=+986.000502981" lastFinishedPulling="2026-03-21 05:08:02.57475227 +0000 UTC m=+987.657335898" observedRunningTime="2026-03-21 05:08:03.919763791 
+0000 UTC m=+989.002347419" watchObservedRunningTime="2026-03-21 05:08:03.924254029 +0000 UTC m=+989.006837657" Mar 21 05:08:04 crc kubenswrapper[4580]: I0321 05:08:04.887582 4580 generic.go:334] "Generic (PLEG): container finished" podID="8604f80a-d308-4e06-8d72-3281dfdc4a6a" containerID="71f777cfaf78023adc21f19c8de59042d08f29ef677ffe40192603ca913efe27" exitCode=0 Mar 21 05:08:04 crc kubenswrapper[4580]: I0321 05:08:04.887942 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-rqw6s" event={"ID":"8604f80a-d308-4e06-8d72-3281dfdc4a6a","Type":"ContainerDied","Data":"71f777cfaf78023adc21f19c8de59042d08f29ef677ffe40192603ca913efe27"} Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.379901 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-r2tm2"] Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.380956 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-r2tm2" Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.385839 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-lkwhz" Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.385879 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.386137 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.397760 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r2tm2"] Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.440425 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwh66\" (UniqueName: 
\"kubernetes.io/projected/0b8cf1e5-6f84-4595-be65-efc781baa914-kube-api-access-zwh66\") pod \"openstack-operator-index-r2tm2\" (UID: \"0b8cf1e5-6f84-4595-be65-efc781baa914\") " pod="openstack-operators/openstack-operator-index-r2tm2" Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.543353 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwh66\" (UniqueName: \"kubernetes.io/projected/0b8cf1e5-6f84-4595-be65-efc781baa914-kube-api-access-zwh66\") pod \"openstack-operator-index-r2tm2\" (UID: \"0b8cf1e5-6f84-4595-be65-efc781baa914\") " pod="openstack-operators/openstack-operator-index-r2tm2" Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.568448 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwh66\" (UniqueName: \"kubernetes.io/projected/0b8cf1e5-6f84-4595-be65-efc781baa914-kube-api-access-zwh66\") pod \"openstack-operator-index-r2tm2\" (UID: \"0b8cf1e5-6f84-4595-be65-efc781baa914\") " pod="openstack-operators/openstack-operator-index-r2tm2" Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.696094 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-r2tm2" Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.910167 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm88x" event={"ID":"30341025-0539-42a5-937b-58ba2187d3ed","Type":"ContainerStarted","Data":"1cc573521ef29abab4dbd645d89dda1974b2170a8e6cd9ea6e38f98cdeaaf433"} Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.920222 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r2tm2"] Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.926683 4580 generic.go:334] "Generic (PLEG): container finished" podID="e3934521-9a59-47c3-baca-2aefdc32a353" containerID="b6ec2ce6ffb814527a81373568e8340e46f8ea7aaab27290ed1aadc715bd441e" exitCode=0 Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.926755 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xl9tp" event={"ID":"e3934521-9a59-47c3-baca-2aefdc32a353","Type":"ContainerDied","Data":"b6ec2ce6ffb814527a81373568e8340e46f8ea7aaab27290ed1aadc715bd441e"} Mar 21 05:08:05 crc kubenswrapper[4580]: I0321 05:08:05.943199 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dm88x" podStartSLOduration=2.881016567 podStartE2EDuration="7.943181472s" podCreationTimestamp="2026-03-21 05:07:58 +0000 UTC" firstStartedPulling="2026-03-21 05:07:59.818155149 +0000 UTC m=+984.900738777" lastFinishedPulling="2026-03-21 05:08:04.880320054 +0000 UTC m=+989.962903682" observedRunningTime="2026-03-21 05:08:05.93817266 +0000 UTC m=+991.020756298" watchObservedRunningTime="2026-03-21 05:08:05.943181472 +0000 UTC m=+991.025765100" Mar 21 05:08:05 crc kubenswrapper[4580]: W0321 05:08:05.953567 4580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b8cf1e5_6f84_4595_be65_efc781baa914.slice/crio-c925238025edc3232bff1b467c12fda2080875692f66197b169a1a688ae0d070 WatchSource:0}: Error finding container c925238025edc3232bff1b467c12fda2080875692f66197b169a1a688ae0d070: Status 404 returned error can't find the container with id c925238025edc3232bff1b467c12fda2080875692f66197b169a1a688ae0d070 Mar 21 05:08:06 crc kubenswrapper[4580]: I0321 05:08:06.253455 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-rqw6s" Mar 21 05:08:06 crc kubenswrapper[4580]: I0321 05:08:06.353690 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v67m\" (UniqueName: \"kubernetes.io/projected/8604f80a-d308-4e06-8d72-3281dfdc4a6a-kube-api-access-6v67m\") pod \"8604f80a-d308-4e06-8d72-3281dfdc4a6a\" (UID: \"8604f80a-d308-4e06-8d72-3281dfdc4a6a\") " Mar 21 05:08:06 crc kubenswrapper[4580]: I0321 05:08:06.377499 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8604f80a-d308-4e06-8d72-3281dfdc4a6a-kube-api-access-6v67m" (OuterVolumeSpecName: "kube-api-access-6v67m") pod "8604f80a-d308-4e06-8d72-3281dfdc4a6a" (UID: "8604f80a-d308-4e06-8d72-3281dfdc4a6a"). InnerVolumeSpecName "kube-api-access-6v67m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:06 crc kubenswrapper[4580]: I0321 05:08:06.455288 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v67m\" (UniqueName: \"kubernetes.io/projected/8604f80a-d308-4e06-8d72-3281dfdc4a6a-kube-api-access-6v67m\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:06 crc kubenswrapper[4580]: I0321 05:08:06.940102 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xl9tp" event={"ID":"e3934521-9a59-47c3-baca-2aefdc32a353","Type":"ContainerStarted","Data":"ab7e3baa5e8acfff7a8054c3e852293c0430241ea8e0c6ac1acd93297c379a27"} Mar 21 05:08:06 crc kubenswrapper[4580]: I0321 05:08:06.945430 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r2tm2" event={"ID":"0b8cf1e5-6f84-4595-be65-efc781baa914","Type":"ContainerStarted","Data":"c925238025edc3232bff1b467c12fda2080875692f66197b169a1a688ae0d070"} Mar 21 05:08:06 crc kubenswrapper[4580]: I0321 05:08:06.948209 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-rqw6s" Mar 21 05:08:06 crc kubenswrapper[4580]: I0321 05:08:06.958017 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-rqw6s" event={"ID":"8604f80a-d308-4e06-8d72-3281dfdc4a6a","Type":"ContainerDied","Data":"2457f6edd35905b744bd62d20ae1d85886b61de283edf9c5bb066140b80c870a"} Mar 21 05:08:06 crc kubenswrapper[4580]: I0321 05:08:06.958064 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2457f6edd35905b744bd62d20ae1d85886b61de283edf9c5bb066140b80c870a" Mar 21 05:08:06 crc kubenswrapper[4580]: I0321 05:08:06.966193 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xl9tp" podStartSLOduration=2.367995494 podStartE2EDuration="4.966170096s" podCreationTimestamp="2026-03-21 05:08:02 +0000 UTC" firstStartedPulling="2026-03-21 05:08:03.881485728 +0000 UTC m=+988.964069356" lastFinishedPulling="2026-03-21 05:08:06.47966033 +0000 UTC m=+991.562243958" observedRunningTime="2026-03-21 05:08:06.962960121 +0000 UTC m=+992.045543749" watchObservedRunningTime="2026-03-21 05:08:06.966170096 +0000 UTC m=+992.048753724" Mar 21 05:08:07 crc kubenswrapper[4580]: I0321 05:08:07.322749 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-t4ldw"] Mar 21 05:08:07 crc kubenswrapper[4580]: I0321 05:08:07.326441 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-t4ldw"] Mar 21 05:08:07 crc kubenswrapper[4580]: I0321 05:08:07.628304 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f774fe43-2dcb-4e22-b54a-db3e1b969706" path="/var/lib/kubelet/pods/f774fe43-2dcb-4e22-b54a-db3e1b969706/volumes" Mar 21 05:08:07 crc kubenswrapper[4580]: I0321 05:08:07.696775 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:08:07 crc kubenswrapper[4580]: I0321 05:08:07.745583 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:08:07 crc kubenswrapper[4580]: I0321 05:08:07.851909 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-vlfhg" Mar 21 05:08:08 crc kubenswrapper[4580]: I0321 05:08:08.919511 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:08:08 crc kubenswrapper[4580]: I0321 05:08:08.919544 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:08:08 crc kubenswrapper[4580]: I0321 05:08:08.965334 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:08:09 crc kubenswrapper[4580]: I0321 05:08:09.970849 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r2tm2" event={"ID":"0b8cf1e5-6f84-4595-be65-efc781baa914","Type":"ContainerStarted","Data":"ff9d1ab2021fc5834ab3a34a2dbf4978688b45922669301f26af7b6e168b4931"} Mar 21 05:08:12 crc kubenswrapper[4580]: I0321 05:08:12.914831 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:12 crc kubenswrapper[4580]: I0321 05:08:12.915687 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:12 crc kubenswrapper[4580]: I0321 05:08:12.964085 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:12 crc kubenswrapper[4580]: I0321 05:08:12.989212 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-r2tm2" podStartSLOduration=4.469245563 podStartE2EDuration="7.989195372s" podCreationTimestamp="2026-03-21 05:08:05 +0000 UTC" firstStartedPulling="2026-03-21 05:08:05.95821377 +0000 UTC m=+991.040797388" lastFinishedPulling="2026-03-21 05:08:09.478163569 +0000 UTC m=+994.560747197" observedRunningTime="2026-03-21 05:08:09.994167115 +0000 UTC m=+995.076750753" watchObservedRunningTime="2026-03-21 05:08:12.989195372 +0000 UTC m=+998.071778990" Mar 21 05:08:13 crc kubenswrapper[4580]: I0321 05:08:13.030138 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:14 crc kubenswrapper[4580]: I0321 05:08:14.768023 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xl9tp"] Mar 21 05:08:14 crc kubenswrapper[4580]: I0321 05:08:14.998678 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xl9tp" podUID="e3934521-9a59-47c3-baca-2aefdc32a353" containerName="registry-server" containerID="cri-o://ab7e3baa5e8acfff7a8054c3e852293c0430241ea8e0c6ac1acd93297c379a27" gracePeriod=2 Mar 21 05:08:15 crc kubenswrapper[4580]: I0321 05:08:15.364226 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:15 crc kubenswrapper[4580]: I0321 05:08:15.471072 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3934521-9a59-47c3-baca-2aefdc32a353-catalog-content\") pod \"e3934521-9a59-47c3-baca-2aefdc32a353\" (UID: \"e3934521-9a59-47c3-baca-2aefdc32a353\") " Mar 21 05:08:15 crc kubenswrapper[4580]: I0321 05:08:15.471119 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8jdz\" (UniqueName: \"kubernetes.io/projected/e3934521-9a59-47c3-baca-2aefdc32a353-kube-api-access-q8jdz\") pod \"e3934521-9a59-47c3-baca-2aefdc32a353\" (UID: \"e3934521-9a59-47c3-baca-2aefdc32a353\") " Mar 21 05:08:15 crc kubenswrapper[4580]: I0321 05:08:15.471180 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3934521-9a59-47c3-baca-2aefdc32a353-utilities\") pod \"e3934521-9a59-47c3-baca-2aefdc32a353\" (UID: \"e3934521-9a59-47c3-baca-2aefdc32a353\") " Mar 21 05:08:15 crc kubenswrapper[4580]: I0321 05:08:15.471945 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3934521-9a59-47c3-baca-2aefdc32a353-utilities" (OuterVolumeSpecName: "utilities") pod "e3934521-9a59-47c3-baca-2aefdc32a353" (UID: "e3934521-9a59-47c3-baca-2aefdc32a353"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:15 crc kubenswrapper[4580]: I0321 05:08:15.477377 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3934521-9a59-47c3-baca-2aefdc32a353-kube-api-access-q8jdz" (OuterVolumeSpecName: "kube-api-access-q8jdz") pod "e3934521-9a59-47c3-baca-2aefdc32a353" (UID: "e3934521-9a59-47c3-baca-2aefdc32a353"). InnerVolumeSpecName "kube-api-access-q8jdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:15 crc kubenswrapper[4580]: I0321 05:08:15.521357 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3934521-9a59-47c3-baca-2aefdc32a353-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3934521-9a59-47c3-baca-2aefdc32a353" (UID: "e3934521-9a59-47c3-baca-2aefdc32a353"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:15 crc kubenswrapper[4580]: I0321 05:08:15.574034 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3934521-9a59-47c3-baca-2aefdc32a353-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:15 crc kubenswrapper[4580]: I0321 05:08:15.574070 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8jdz\" (UniqueName: \"kubernetes.io/projected/e3934521-9a59-47c3-baca-2aefdc32a353-kube-api-access-q8jdz\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:15 crc kubenswrapper[4580]: I0321 05:08:15.574084 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3934521-9a59-47c3-baca-2aefdc32a353-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:15 crc kubenswrapper[4580]: I0321 05:08:15.697139 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-r2tm2" Mar 21 05:08:15 crc kubenswrapper[4580]: I0321 05:08:15.697188 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-r2tm2" Mar 21 05:08:15 crc kubenswrapper[4580]: I0321 05:08:15.723694 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-r2tm2" Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.006589 4580 generic.go:334] "Generic (PLEG): 
container finished" podID="e3934521-9a59-47c3-baca-2aefdc32a353" containerID="ab7e3baa5e8acfff7a8054c3e852293c0430241ea8e0c6ac1acd93297c379a27" exitCode=0 Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.007024 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xl9tp" Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.007497 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xl9tp" event={"ID":"e3934521-9a59-47c3-baca-2aefdc32a353","Type":"ContainerDied","Data":"ab7e3baa5e8acfff7a8054c3e852293c0430241ea8e0c6ac1acd93297c379a27"} Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.007534 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xl9tp" event={"ID":"e3934521-9a59-47c3-baca-2aefdc32a353","Type":"ContainerDied","Data":"972ca05b544aaa8d7d65a59278e5202de5e01012fb5f02416cb6195a9c38e133"} Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.007558 4580 scope.go:117] "RemoveContainer" containerID="ab7e3baa5e8acfff7a8054c3e852293c0430241ea8e0c6ac1acd93297c379a27" Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.029808 4580 scope.go:117] "RemoveContainer" containerID="b6ec2ce6ffb814527a81373568e8340e46f8ea7aaab27290ed1aadc715bd441e" Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.032453 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xl9tp"] Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.040363 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-r2tm2" Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.041901 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xl9tp"] Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.054902 4580 scope.go:117] "RemoveContainer" 
containerID="d0fa01d6d003281d95405b496129af20dc8d758e65d50ed2fb36b95cb4a6fe87" Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.090205 4580 scope.go:117] "RemoveContainer" containerID="ab7e3baa5e8acfff7a8054c3e852293c0430241ea8e0c6ac1acd93297c379a27" Mar 21 05:08:16 crc kubenswrapper[4580]: E0321 05:08:16.090784 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab7e3baa5e8acfff7a8054c3e852293c0430241ea8e0c6ac1acd93297c379a27\": container with ID starting with ab7e3baa5e8acfff7a8054c3e852293c0430241ea8e0c6ac1acd93297c379a27 not found: ID does not exist" containerID="ab7e3baa5e8acfff7a8054c3e852293c0430241ea8e0c6ac1acd93297c379a27" Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.090846 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab7e3baa5e8acfff7a8054c3e852293c0430241ea8e0c6ac1acd93297c379a27"} err="failed to get container status \"ab7e3baa5e8acfff7a8054c3e852293c0430241ea8e0c6ac1acd93297c379a27\": rpc error: code = NotFound desc = could not find container \"ab7e3baa5e8acfff7a8054c3e852293c0430241ea8e0c6ac1acd93297c379a27\": container with ID starting with ab7e3baa5e8acfff7a8054c3e852293c0430241ea8e0c6ac1acd93297c379a27 not found: ID does not exist" Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.090873 4580 scope.go:117] "RemoveContainer" containerID="b6ec2ce6ffb814527a81373568e8340e46f8ea7aaab27290ed1aadc715bd441e" Mar 21 05:08:16 crc kubenswrapper[4580]: E0321 05:08:16.091322 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ec2ce6ffb814527a81373568e8340e46f8ea7aaab27290ed1aadc715bd441e\": container with ID starting with b6ec2ce6ffb814527a81373568e8340e46f8ea7aaab27290ed1aadc715bd441e not found: ID does not exist" containerID="b6ec2ce6ffb814527a81373568e8340e46f8ea7aaab27290ed1aadc715bd441e" Mar 21 05:08:16 crc 
kubenswrapper[4580]: I0321 05:08:16.091353 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ec2ce6ffb814527a81373568e8340e46f8ea7aaab27290ed1aadc715bd441e"} err="failed to get container status \"b6ec2ce6ffb814527a81373568e8340e46f8ea7aaab27290ed1aadc715bd441e\": rpc error: code = NotFound desc = could not find container \"b6ec2ce6ffb814527a81373568e8340e46f8ea7aaab27290ed1aadc715bd441e\": container with ID starting with b6ec2ce6ffb814527a81373568e8340e46f8ea7aaab27290ed1aadc715bd441e not found: ID does not exist" Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.091373 4580 scope.go:117] "RemoveContainer" containerID="d0fa01d6d003281d95405b496129af20dc8d758e65d50ed2fb36b95cb4a6fe87" Mar 21 05:08:16 crc kubenswrapper[4580]: E0321 05:08:16.091666 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0fa01d6d003281d95405b496129af20dc8d758e65d50ed2fb36b95cb4a6fe87\": container with ID starting with d0fa01d6d003281d95405b496129af20dc8d758e65d50ed2fb36b95cb4a6fe87 not found: ID does not exist" containerID="d0fa01d6d003281d95405b496129af20dc8d758e65d50ed2fb36b95cb4a6fe87" Mar 21 05:08:16 crc kubenswrapper[4580]: I0321 05:08:16.091711 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0fa01d6d003281d95405b496129af20dc8d758e65d50ed2fb36b95cb4a6fe87"} err="failed to get container status \"d0fa01d6d003281d95405b496129af20dc8d758e65d50ed2fb36b95cb4a6fe87\": rpc error: code = NotFound desc = could not find container \"d0fa01d6d003281d95405b496129af20dc8d758e65d50ed2fb36b95cb4a6fe87\": container with ID starting with d0fa01d6d003281d95405b496129af20dc8d758e65d50ed2fb36b95cb4a6fe87 not found: ID does not exist" Mar 21 05:08:17 crc kubenswrapper[4580]: I0321 05:08:17.625250 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3934521-9a59-47c3-baca-2aefdc32a353" 
path="/var/lib/kubelet/pods/e3934521-9a59-47c3-baca-2aefdc32a353/volumes" Mar 21 05:08:17 crc kubenswrapper[4580]: I0321 05:08:17.689741 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qdhgl" Mar 21 05:08:17 crc kubenswrapper[4580]: I0321 05:08:17.704734 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-bzwqp" Mar 21 05:08:17 crc kubenswrapper[4580]: I0321 05:08:17.828635 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4"] Mar 21 05:08:17 crc kubenswrapper[4580]: E0321 05:08:17.828885 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3934521-9a59-47c3-baca-2aefdc32a353" containerName="extract-utilities" Mar 21 05:08:17 crc kubenswrapper[4580]: I0321 05:08:17.828898 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3934521-9a59-47c3-baca-2aefdc32a353" containerName="extract-utilities" Mar 21 05:08:17 crc kubenswrapper[4580]: E0321 05:08:17.828915 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3934521-9a59-47c3-baca-2aefdc32a353" containerName="extract-content" Mar 21 05:08:17 crc kubenswrapper[4580]: I0321 05:08:17.828923 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3934521-9a59-47c3-baca-2aefdc32a353" containerName="extract-content" Mar 21 05:08:17 crc kubenswrapper[4580]: E0321 05:08:17.828935 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3934521-9a59-47c3-baca-2aefdc32a353" containerName="registry-server" Mar 21 05:08:17 crc kubenswrapper[4580]: I0321 05:08:17.828941 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3934521-9a59-47c3-baca-2aefdc32a353" containerName="registry-server" Mar 21 05:08:17 crc kubenswrapper[4580]: E0321 05:08:17.828954 4580 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8604f80a-d308-4e06-8d72-3281dfdc4a6a" containerName="oc" Mar 21 05:08:17 crc kubenswrapper[4580]: I0321 05:08:17.828959 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8604f80a-d308-4e06-8d72-3281dfdc4a6a" containerName="oc" Mar 21 05:08:17 crc kubenswrapper[4580]: I0321 05:08:17.829081 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3934521-9a59-47c3-baca-2aefdc32a353" containerName="registry-server" Mar 21 05:08:17 crc kubenswrapper[4580]: I0321 05:08:17.829091 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="8604f80a-d308-4e06-8d72-3281dfdc4a6a" containerName="oc" Mar 21 05:08:17 crc kubenswrapper[4580]: I0321 05:08:17.829943 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" Mar 21 05:08:17 crc kubenswrapper[4580]: I0321 05:08:17.832713 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-jkfp4" Mar 21 05:08:17 crc kubenswrapper[4580]: I0321 05:08:17.838222 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4"] Mar 21 05:08:18 crc kubenswrapper[4580]: I0321 05:08:18.012695 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1afb6781-0e31-4285-aac3-f6ad107c14e5-bundle\") pod \"e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4\" (UID: \"1afb6781-0e31-4285-aac3-f6ad107c14e5\") " pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" Mar 21 05:08:18 crc kubenswrapper[4580]: I0321 05:08:18.012762 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p59c9\" (UniqueName: 
\"kubernetes.io/projected/1afb6781-0e31-4285-aac3-f6ad107c14e5-kube-api-access-p59c9\") pod \"e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4\" (UID: \"1afb6781-0e31-4285-aac3-f6ad107c14e5\") " pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" Mar 21 05:08:18 crc kubenswrapper[4580]: I0321 05:08:18.012839 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1afb6781-0e31-4285-aac3-f6ad107c14e5-util\") pod \"e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4\" (UID: \"1afb6781-0e31-4285-aac3-f6ad107c14e5\") " pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" Mar 21 05:08:18 crc kubenswrapper[4580]: I0321 05:08:18.114170 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1afb6781-0e31-4285-aac3-f6ad107c14e5-bundle\") pod \"e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4\" (UID: \"1afb6781-0e31-4285-aac3-f6ad107c14e5\") " pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" Mar 21 05:08:18 crc kubenswrapper[4580]: I0321 05:08:18.114235 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p59c9\" (UniqueName: \"kubernetes.io/projected/1afb6781-0e31-4285-aac3-f6ad107c14e5-kube-api-access-p59c9\") pod \"e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4\" (UID: \"1afb6781-0e31-4285-aac3-f6ad107c14e5\") " pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" Mar 21 05:08:18 crc kubenswrapper[4580]: I0321 05:08:18.114275 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1afb6781-0e31-4285-aac3-f6ad107c14e5-util\") pod 
\"e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4\" (UID: \"1afb6781-0e31-4285-aac3-f6ad107c14e5\") " pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" Mar 21 05:08:18 crc kubenswrapper[4580]: I0321 05:08:18.114860 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1afb6781-0e31-4285-aac3-f6ad107c14e5-util\") pod \"e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4\" (UID: \"1afb6781-0e31-4285-aac3-f6ad107c14e5\") " pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" Mar 21 05:08:18 crc kubenswrapper[4580]: I0321 05:08:18.115102 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1afb6781-0e31-4285-aac3-f6ad107c14e5-bundle\") pod \"e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4\" (UID: \"1afb6781-0e31-4285-aac3-f6ad107c14e5\") " pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" Mar 21 05:08:18 crc kubenswrapper[4580]: I0321 05:08:18.132949 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p59c9\" (UniqueName: \"kubernetes.io/projected/1afb6781-0e31-4285-aac3-f6ad107c14e5-kube-api-access-p59c9\") pod \"e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4\" (UID: \"1afb6781-0e31-4285-aac3-f6ad107c14e5\") " pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" Mar 21 05:08:18 crc kubenswrapper[4580]: I0321 05:08:18.146340 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" Mar 21 05:08:18 crc kubenswrapper[4580]: I0321 05:08:18.601393 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4"] Mar 21 05:08:18 crc kubenswrapper[4580]: I0321 05:08:18.967403 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:08:19 crc kubenswrapper[4580]: I0321 05:08:19.027282 4580 generic.go:334] "Generic (PLEG): container finished" podID="1afb6781-0e31-4285-aac3-f6ad107c14e5" containerID="df875cc35bddcb01b788111dbc775c375f9d6fdc792260f69e259153b0f110da" exitCode=0 Mar 21 05:08:19 crc kubenswrapper[4580]: I0321 05:08:19.027325 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" event={"ID":"1afb6781-0e31-4285-aac3-f6ad107c14e5","Type":"ContainerDied","Data":"df875cc35bddcb01b788111dbc775c375f9d6fdc792260f69e259153b0f110da"} Mar 21 05:08:19 crc kubenswrapper[4580]: I0321 05:08:19.027359 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" event={"ID":"1afb6781-0e31-4285-aac3-f6ad107c14e5","Type":"ContainerStarted","Data":"451ba8f887c4f19c80ad35bb751473509badb73bd77611a41a0c7ea0a72a9de8"} Mar 21 05:08:20 crc kubenswrapper[4580]: I0321 05:08:20.034642 4580 generic.go:334] "Generic (PLEG): container finished" podID="1afb6781-0e31-4285-aac3-f6ad107c14e5" containerID="b1ead13886ffa6db9226c22bcace550800d253da50cebeb92c8675cccb622420" exitCode=0 Mar 21 05:08:20 crc kubenswrapper[4580]: I0321 05:08:20.034707 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" 
event={"ID":"1afb6781-0e31-4285-aac3-f6ad107c14e5","Type":"ContainerDied","Data":"b1ead13886ffa6db9226c22bcace550800d253da50cebeb92c8675cccb622420"} Mar 21 05:08:21 crc kubenswrapper[4580]: I0321 05:08:21.044496 4580 generic.go:334] "Generic (PLEG): container finished" podID="1afb6781-0e31-4285-aac3-f6ad107c14e5" containerID="2693a615d2a24d0ba1247427789c8cc518d46404172f883330643821f545e45a" exitCode=0 Mar 21 05:08:21 crc kubenswrapper[4580]: I0321 05:08:21.044572 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" event={"ID":"1afb6781-0e31-4285-aac3-f6ad107c14e5","Type":"ContainerDied","Data":"2693a615d2a24d0ba1247427789c8cc518d46404172f883330643821f545e45a"} Mar 21 05:08:22 crc kubenswrapper[4580]: I0321 05:08:22.299178 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" Mar 21 05:08:22 crc kubenswrapper[4580]: I0321 05:08:22.475507 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p59c9\" (UniqueName: \"kubernetes.io/projected/1afb6781-0e31-4285-aac3-f6ad107c14e5-kube-api-access-p59c9\") pod \"1afb6781-0e31-4285-aac3-f6ad107c14e5\" (UID: \"1afb6781-0e31-4285-aac3-f6ad107c14e5\") " Mar 21 05:08:22 crc kubenswrapper[4580]: I0321 05:08:22.475937 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1afb6781-0e31-4285-aac3-f6ad107c14e5-util\") pod \"1afb6781-0e31-4285-aac3-f6ad107c14e5\" (UID: \"1afb6781-0e31-4285-aac3-f6ad107c14e5\") " Mar 21 05:08:22 crc kubenswrapper[4580]: I0321 05:08:22.476154 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1afb6781-0e31-4285-aac3-f6ad107c14e5-bundle\") pod \"1afb6781-0e31-4285-aac3-f6ad107c14e5\" (UID: 
\"1afb6781-0e31-4285-aac3-f6ad107c14e5\") " Mar 21 05:08:22 crc kubenswrapper[4580]: I0321 05:08:22.476842 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1afb6781-0e31-4285-aac3-f6ad107c14e5-bundle" (OuterVolumeSpecName: "bundle") pod "1afb6781-0e31-4285-aac3-f6ad107c14e5" (UID: "1afb6781-0e31-4285-aac3-f6ad107c14e5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:22 crc kubenswrapper[4580]: I0321 05:08:22.484953 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1afb6781-0e31-4285-aac3-f6ad107c14e5-kube-api-access-p59c9" (OuterVolumeSpecName: "kube-api-access-p59c9") pod "1afb6781-0e31-4285-aac3-f6ad107c14e5" (UID: "1afb6781-0e31-4285-aac3-f6ad107c14e5"). InnerVolumeSpecName "kube-api-access-p59c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:22 crc kubenswrapper[4580]: I0321 05:08:22.489105 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1afb6781-0e31-4285-aac3-f6ad107c14e5-util" (OuterVolumeSpecName: "util") pod "1afb6781-0e31-4285-aac3-f6ad107c14e5" (UID: "1afb6781-0e31-4285-aac3-f6ad107c14e5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:22 crc kubenswrapper[4580]: I0321 05:08:22.577911 4580 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1afb6781-0e31-4285-aac3-f6ad107c14e5-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:22 crc kubenswrapper[4580]: I0321 05:08:22.577967 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p59c9\" (UniqueName: \"kubernetes.io/projected/1afb6781-0e31-4285-aac3-f6ad107c14e5-kube-api-access-p59c9\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:22 crc kubenswrapper[4580]: I0321 05:08:22.577998 4580 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1afb6781-0e31-4285-aac3-f6ad107c14e5-util\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.061333 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" event={"ID":"1afb6781-0e31-4285-aac3-f6ad107c14e5","Type":"ContainerDied","Data":"451ba8f887c4f19c80ad35bb751473509badb73bd77611a41a0c7ea0a72a9de8"} Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.061371 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="451ba8f887c4f19c80ad35bb751473509badb73bd77611a41a0c7ea0a72a9de8" Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.061495 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4" Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.366932 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dm88x"] Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.368966 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dm88x" podUID="30341025-0539-42a5-937b-58ba2187d3ed" containerName="registry-server" containerID="cri-o://1cc573521ef29abab4dbd645d89dda1974b2170a8e6cd9ea6e38f98cdeaaf433" gracePeriod=2 Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.749184 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.896057 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flwms\" (UniqueName: \"kubernetes.io/projected/30341025-0539-42a5-937b-58ba2187d3ed-kube-api-access-flwms\") pod \"30341025-0539-42a5-937b-58ba2187d3ed\" (UID: \"30341025-0539-42a5-937b-58ba2187d3ed\") " Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.896139 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30341025-0539-42a5-937b-58ba2187d3ed-catalog-content\") pod \"30341025-0539-42a5-937b-58ba2187d3ed\" (UID: \"30341025-0539-42a5-937b-58ba2187d3ed\") " Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.896218 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30341025-0539-42a5-937b-58ba2187d3ed-utilities\") pod \"30341025-0539-42a5-937b-58ba2187d3ed\" (UID: \"30341025-0539-42a5-937b-58ba2187d3ed\") " Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.897162 4580 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30341025-0539-42a5-937b-58ba2187d3ed-utilities" (OuterVolumeSpecName: "utilities") pod "30341025-0539-42a5-937b-58ba2187d3ed" (UID: "30341025-0539-42a5-937b-58ba2187d3ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.899374 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30341025-0539-42a5-937b-58ba2187d3ed-kube-api-access-flwms" (OuterVolumeSpecName: "kube-api-access-flwms") pod "30341025-0539-42a5-937b-58ba2187d3ed" (UID: "30341025-0539-42a5-937b-58ba2187d3ed"). InnerVolumeSpecName "kube-api-access-flwms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.946229 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30341025-0539-42a5-937b-58ba2187d3ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30341025-0539-42a5-937b-58ba2187d3ed" (UID: "30341025-0539-42a5-937b-58ba2187d3ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.998010 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flwms\" (UniqueName: \"kubernetes.io/projected/30341025-0539-42a5-937b-58ba2187d3ed-kube-api-access-flwms\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.998060 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30341025-0539-42a5-937b-58ba2187d3ed-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:23 crc kubenswrapper[4580]: I0321 05:08:23.998070 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30341025-0539-42a5-937b-58ba2187d3ed-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.069737 4580 generic.go:334] "Generic (PLEG): container finished" podID="30341025-0539-42a5-937b-58ba2187d3ed" containerID="1cc573521ef29abab4dbd645d89dda1974b2170a8e6cd9ea6e38f98cdeaaf433" exitCode=0 Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.069794 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm88x" event={"ID":"30341025-0539-42a5-937b-58ba2187d3ed","Type":"ContainerDied","Data":"1cc573521ef29abab4dbd645d89dda1974b2170a8e6cd9ea6e38f98cdeaaf433"} Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.069821 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dm88x" Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.069842 4580 scope.go:117] "RemoveContainer" containerID="1cc573521ef29abab4dbd645d89dda1974b2170a8e6cd9ea6e38f98cdeaaf433" Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.069830 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm88x" event={"ID":"30341025-0539-42a5-937b-58ba2187d3ed","Type":"ContainerDied","Data":"c9f3f0796135f7ad131bfa8cae0c494cec3958c17459ce88020eb4acd67a91c9"} Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.097009 4580 scope.go:117] "RemoveContainer" containerID="ea7e7b483ca55b363711497425ce34f8d6330f62c485d4ae08acbbcad56cb417" Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.111120 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dm88x"] Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.118408 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dm88x"] Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.122027 4580 scope.go:117] "RemoveContainer" containerID="83fa8ee72bf09090f0fe8b26e76916d8852cb40e91e683de771c8c2e2c9d8e7b" Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.139841 4580 scope.go:117] "RemoveContainer" containerID="1cc573521ef29abab4dbd645d89dda1974b2170a8e6cd9ea6e38f98cdeaaf433" Mar 21 05:08:24 crc kubenswrapper[4580]: E0321 05:08:24.140481 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cc573521ef29abab4dbd645d89dda1974b2170a8e6cd9ea6e38f98cdeaaf433\": container with ID starting with 1cc573521ef29abab4dbd645d89dda1974b2170a8e6cd9ea6e38f98cdeaaf433 not found: ID does not exist" containerID="1cc573521ef29abab4dbd645d89dda1974b2170a8e6cd9ea6e38f98cdeaaf433" Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.140521 4580 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc573521ef29abab4dbd645d89dda1974b2170a8e6cd9ea6e38f98cdeaaf433"} err="failed to get container status \"1cc573521ef29abab4dbd645d89dda1974b2170a8e6cd9ea6e38f98cdeaaf433\": rpc error: code = NotFound desc = could not find container \"1cc573521ef29abab4dbd645d89dda1974b2170a8e6cd9ea6e38f98cdeaaf433\": container with ID starting with 1cc573521ef29abab4dbd645d89dda1974b2170a8e6cd9ea6e38f98cdeaaf433 not found: ID does not exist" Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.140545 4580 scope.go:117] "RemoveContainer" containerID="ea7e7b483ca55b363711497425ce34f8d6330f62c485d4ae08acbbcad56cb417" Mar 21 05:08:24 crc kubenswrapper[4580]: E0321 05:08:24.140960 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea7e7b483ca55b363711497425ce34f8d6330f62c485d4ae08acbbcad56cb417\": container with ID starting with ea7e7b483ca55b363711497425ce34f8d6330f62c485d4ae08acbbcad56cb417 not found: ID does not exist" containerID="ea7e7b483ca55b363711497425ce34f8d6330f62c485d4ae08acbbcad56cb417" Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.140982 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7e7b483ca55b363711497425ce34f8d6330f62c485d4ae08acbbcad56cb417"} err="failed to get container status \"ea7e7b483ca55b363711497425ce34f8d6330f62c485d4ae08acbbcad56cb417\": rpc error: code = NotFound desc = could not find container \"ea7e7b483ca55b363711497425ce34f8d6330f62c485d4ae08acbbcad56cb417\": container with ID starting with ea7e7b483ca55b363711497425ce34f8d6330f62c485d4ae08acbbcad56cb417 not found: ID does not exist" Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.140997 4580 scope.go:117] "RemoveContainer" containerID="83fa8ee72bf09090f0fe8b26e76916d8852cb40e91e683de771c8c2e2c9d8e7b" Mar 21 05:08:24 crc kubenswrapper[4580]: E0321 
05:08:24.141308 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83fa8ee72bf09090f0fe8b26e76916d8852cb40e91e683de771c8c2e2c9d8e7b\": container with ID starting with 83fa8ee72bf09090f0fe8b26e76916d8852cb40e91e683de771c8c2e2c9d8e7b not found: ID does not exist" containerID="83fa8ee72bf09090f0fe8b26e76916d8852cb40e91e683de771c8c2e2c9d8e7b" Mar 21 05:08:24 crc kubenswrapper[4580]: I0321 05:08:24.141335 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83fa8ee72bf09090f0fe8b26e76916d8852cb40e91e683de771c8c2e2c9d8e7b"} err="failed to get container status \"83fa8ee72bf09090f0fe8b26e76916d8852cb40e91e683de771c8c2e2c9d8e7b\": rpc error: code = NotFound desc = could not find container \"83fa8ee72bf09090f0fe8b26e76916d8852cb40e91e683de771c8c2e2c9d8e7b\": container with ID starting with 83fa8ee72bf09090f0fe8b26e76916d8852cb40e91e683de771c8c2e2c9d8e7b not found: ID does not exist" Mar 21 05:08:25 crc kubenswrapper[4580]: I0321 05:08:25.631253 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30341025-0539-42a5-937b-58ba2187d3ed" path="/var/lib/kubelet/pods/30341025-0539-42a5-937b-58ba2187d3ed/volumes" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.523769 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-68b88cfb78-cf92m"] Mar 21 05:08:28 crc kubenswrapper[4580]: E0321 05:08:28.524363 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afb6781-0e31-4285-aac3-f6ad107c14e5" containerName="pull" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.524379 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afb6781-0e31-4285-aac3-f6ad107c14e5" containerName="pull" Mar 21 05:08:28 crc kubenswrapper[4580]: E0321 05:08:28.524399 4580 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="30341025-0539-42a5-937b-58ba2187d3ed" containerName="extract-content" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.524407 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="30341025-0539-42a5-937b-58ba2187d3ed" containerName="extract-content" Mar 21 05:08:28 crc kubenswrapper[4580]: E0321 05:08:28.524420 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30341025-0539-42a5-937b-58ba2187d3ed" containerName="registry-server" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.524428 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="30341025-0539-42a5-937b-58ba2187d3ed" containerName="registry-server" Mar 21 05:08:28 crc kubenswrapper[4580]: E0321 05:08:28.524438 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afb6781-0e31-4285-aac3-f6ad107c14e5" containerName="extract" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.524445 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afb6781-0e31-4285-aac3-f6ad107c14e5" containerName="extract" Mar 21 05:08:28 crc kubenswrapper[4580]: E0321 05:08:28.524458 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30341025-0539-42a5-937b-58ba2187d3ed" containerName="extract-utilities" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.524464 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="30341025-0539-42a5-937b-58ba2187d3ed" containerName="extract-utilities" Mar 21 05:08:28 crc kubenswrapper[4580]: E0321 05:08:28.524475 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afb6781-0e31-4285-aac3-f6ad107c14e5" containerName="util" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.524481 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afb6781-0e31-4285-aac3-f6ad107c14e5" containerName="util" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.524605 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="1afb6781-0e31-4285-aac3-f6ad107c14e5" 
containerName="extract" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.524620 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="30341025-0539-42a5-937b-58ba2187d3ed" containerName="registry-server" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.525098 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-68b88cfb78-cf92m" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.527496 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-l7vmn" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.562276 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-68b88cfb78-cf92m"] Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.656610 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv7h8\" (UniqueName: \"kubernetes.io/projected/d316a65b-0041-42cc-bf46-c6c8801c44a5-kube-api-access-jv7h8\") pod \"openstack-operator-controller-init-68b88cfb78-cf92m\" (UID: \"d316a65b-0041-42cc-bf46-c6c8801c44a5\") " pod="openstack-operators/openstack-operator-controller-init-68b88cfb78-cf92m" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.757612 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv7h8\" (UniqueName: \"kubernetes.io/projected/d316a65b-0041-42cc-bf46-c6c8801c44a5-kube-api-access-jv7h8\") pod \"openstack-operator-controller-init-68b88cfb78-cf92m\" (UID: \"d316a65b-0041-42cc-bf46-c6c8801c44a5\") " pod="openstack-operators/openstack-operator-controller-init-68b88cfb78-cf92m" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.786098 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv7h8\" (UniqueName: 
\"kubernetes.io/projected/d316a65b-0041-42cc-bf46-c6c8801c44a5-kube-api-access-jv7h8\") pod \"openstack-operator-controller-init-68b88cfb78-cf92m\" (UID: \"d316a65b-0041-42cc-bf46-c6c8801c44a5\") " pod="openstack-operators/openstack-operator-controller-init-68b88cfb78-cf92m" Mar 21 05:08:28 crc kubenswrapper[4580]: I0321 05:08:28.844691 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-68b88cfb78-cf92m" Mar 21 05:08:29 crc kubenswrapper[4580]: I0321 05:08:29.283742 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-68b88cfb78-cf92m"] Mar 21 05:08:30 crc kubenswrapper[4580]: I0321 05:08:30.120773 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-68b88cfb78-cf92m" event={"ID":"d316a65b-0041-42cc-bf46-c6c8801c44a5","Type":"ContainerStarted","Data":"4c8c72dce5efa63752ab70a310816fd4a26bfc379eea4d61331de2cfb44132ec"} Mar 21 05:08:34 crc kubenswrapper[4580]: I0321 05:08:34.145310 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-68b88cfb78-cf92m" event={"ID":"d316a65b-0041-42cc-bf46-c6c8801c44a5","Type":"ContainerStarted","Data":"19d9fdb57e9a628378270529c106f5a207a3eacb7908fd1fd11fa097f1d4b96f"} Mar 21 05:08:34 crc kubenswrapper[4580]: I0321 05:08:34.145919 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-68b88cfb78-cf92m" Mar 21 05:08:34 crc kubenswrapper[4580]: I0321 05:08:34.173380 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-68b88cfb78-cf92m" podStartSLOduration=1.843962956 podStartE2EDuration="6.173361761s" podCreationTimestamp="2026-03-21 05:08:28 +0000 UTC" firstStartedPulling="2026-03-21 05:08:29.302211398 +0000 UTC 
m=+1014.384795026" lastFinishedPulling="2026-03-21 05:08:33.631610193 +0000 UTC m=+1018.714193831" observedRunningTime="2026-03-21 05:08:34.171483531 +0000 UTC m=+1019.254067169" watchObservedRunningTime="2026-03-21 05:08:34.173361761 +0000 UTC m=+1019.255945399" Mar 21 05:08:38 crc kubenswrapper[4580]: I0321 05:08:38.847853 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-68b88cfb78-cf92m" Mar 21 05:08:45 crc kubenswrapper[4580]: I0321 05:08:45.584019 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7spcb"] Mar 21 05:08:45 crc kubenswrapper[4580]: I0321 05:08:45.585842 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:45 crc kubenswrapper[4580]: I0321 05:08:45.602219 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7spcb"] Mar 21 05:08:45 crc kubenswrapper[4580]: I0321 05:08:45.685083 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l29m6\" (UniqueName: \"kubernetes.io/projected/97d28b1b-fab2-4a70-88e6-a9d956721966-kube-api-access-l29m6\") pod \"redhat-marketplace-7spcb\" (UID: \"97d28b1b-fab2-4a70-88e6-a9d956721966\") " pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:45 crc kubenswrapper[4580]: I0321 05:08:45.685429 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97d28b1b-fab2-4a70-88e6-a9d956721966-catalog-content\") pod \"redhat-marketplace-7spcb\" (UID: \"97d28b1b-fab2-4a70-88e6-a9d956721966\") " pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:45 crc kubenswrapper[4580]: I0321 05:08:45.685453 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97d28b1b-fab2-4a70-88e6-a9d956721966-utilities\") pod \"redhat-marketplace-7spcb\" (UID: \"97d28b1b-fab2-4a70-88e6-a9d956721966\") " pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:45 crc kubenswrapper[4580]: I0321 05:08:45.786981 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97d28b1b-fab2-4a70-88e6-a9d956721966-catalog-content\") pod \"redhat-marketplace-7spcb\" (UID: \"97d28b1b-fab2-4a70-88e6-a9d956721966\") " pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:45 crc kubenswrapper[4580]: I0321 05:08:45.787022 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97d28b1b-fab2-4a70-88e6-a9d956721966-utilities\") pod \"redhat-marketplace-7spcb\" (UID: \"97d28b1b-fab2-4a70-88e6-a9d956721966\") " pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:45 crc kubenswrapper[4580]: I0321 05:08:45.787060 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l29m6\" (UniqueName: \"kubernetes.io/projected/97d28b1b-fab2-4a70-88e6-a9d956721966-kube-api-access-l29m6\") pod \"redhat-marketplace-7spcb\" (UID: \"97d28b1b-fab2-4a70-88e6-a9d956721966\") " pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:45 crc kubenswrapper[4580]: I0321 05:08:45.787948 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97d28b1b-fab2-4a70-88e6-a9d956721966-catalog-content\") pod \"redhat-marketplace-7spcb\" (UID: \"97d28b1b-fab2-4a70-88e6-a9d956721966\") " pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:45 crc kubenswrapper[4580]: I0321 05:08:45.788170 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/97d28b1b-fab2-4a70-88e6-a9d956721966-utilities\") pod \"redhat-marketplace-7spcb\" (UID: \"97d28b1b-fab2-4a70-88e6-a9d956721966\") " pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:45 crc kubenswrapper[4580]: I0321 05:08:45.834769 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l29m6\" (UniqueName: \"kubernetes.io/projected/97d28b1b-fab2-4a70-88e6-a9d956721966-kube-api-access-l29m6\") pod \"redhat-marketplace-7spcb\" (UID: \"97d28b1b-fab2-4a70-88e6-a9d956721966\") " pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:45 crc kubenswrapper[4580]: I0321 05:08:45.902284 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:46 crc kubenswrapper[4580]: I0321 05:08:46.407026 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7spcb"] Mar 21 05:08:47 crc kubenswrapper[4580]: I0321 05:08:47.230118 4580 generic.go:334] "Generic (PLEG): container finished" podID="97d28b1b-fab2-4a70-88e6-a9d956721966" containerID="402a606a72ee541d3258ae8555d8ca72f394bfd8a48e9d215a9bbb933713c39b" exitCode=0 Mar 21 05:08:47 crc kubenswrapper[4580]: I0321 05:08:47.230222 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7spcb" event={"ID":"97d28b1b-fab2-4a70-88e6-a9d956721966","Type":"ContainerDied","Data":"402a606a72ee541d3258ae8555d8ca72f394bfd8a48e9d215a9bbb933713c39b"} Mar 21 05:08:47 crc kubenswrapper[4580]: I0321 05:08:47.230409 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7spcb" event={"ID":"97d28b1b-fab2-4a70-88e6-a9d956721966","Type":"ContainerStarted","Data":"52c5fb9d0817669c329b20f525b6dc26279d729a42d903f858b60575a75aae0d"} Mar 21 05:08:48 crc kubenswrapper[4580]: I0321 05:08:48.238579 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-7spcb" event={"ID":"97d28b1b-fab2-4a70-88e6-a9d956721966","Type":"ContainerStarted","Data":"3e8d95e65a0bd4395fcf7b1cc1928118888f24222f7d9f3105f1e4ac634ac3c8"} Mar 21 05:08:49 crc kubenswrapper[4580]: I0321 05:08:49.246189 4580 generic.go:334] "Generic (PLEG): container finished" podID="97d28b1b-fab2-4a70-88e6-a9d956721966" containerID="3e8d95e65a0bd4395fcf7b1cc1928118888f24222f7d9f3105f1e4ac634ac3c8" exitCode=0 Mar 21 05:08:49 crc kubenswrapper[4580]: I0321 05:08:49.246297 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7spcb" event={"ID":"97d28b1b-fab2-4a70-88e6-a9d956721966","Type":"ContainerDied","Data":"3e8d95e65a0bd4395fcf7b1cc1928118888f24222f7d9f3105f1e4ac634ac3c8"} Mar 21 05:08:50 crc kubenswrapper[4580]: I0321 05:08:50.254489 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7spcb" event={"ID":"97d28b1b-fab2-4a70-88e6-a9d956721966","Type":"ContainerStarted","Data":"416bb501bce96a3ce01955abf410ba60e297d73666a5516fad437ec46ca0aa6d"} Mar 21 05:08:52 crc kubenswrapper[4580]: I0321 05:08:52.000084 4580 scope.go:117] "RemoveContainer" containerID="4a5bab3815b9de316e26b637338530acea669073a0d814e9ade3daa248c0aacb" Mar 21 05:08:55 crc kubenswrapper[4580]: I0321 05:08:55.904078 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:55 crc kubenswrapper[4580]: I0321 05:08:55.904704 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:55 crc kubenswrapper[4580]: I0321 05:08:55.943119 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:55 crc kubenswrapper[4580]: I0321 05:08:55.961040 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-7spcb" podStartSLOduration=8.548169051 podStartE2EDuration="10.961023941s" podCreationTimestamp="2026-03-21 05:08:45 +0000 UTC" firstStartedPulling="2026-03-21 05:08:47.23228991 +0000 UTC m=+1032.314873538" lastFinishedPulling="2026-03-21 05:08:49.6451448 +0000 UTC m=+1034.727728428" observedRunningTime="2026-03-21 05:08:50.302483155 +0000 UTC m=+1035.385066813" watchObservedRunningTime="2026-03-21 05:08:55.961023941 +0000 UTC m=+1041.043607569" Mar 21 05:08:56 crc kubenswrapper[4580]: I0321 05:08:56.328393 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:56 crc kubenswrapper[4580]: I0321 05:08:56.366609 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7spcb"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.146750 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-6h2nr"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.147731 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-6h2nr" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.151774 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-p9ksm" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.169767 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-vxzhk"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.170528 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-vxzhk" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.172655 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-ttvpc" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.176711 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-6h2nr"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.199746 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-vxzhk"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.204470 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-lkjq4"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.205165 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-lkjq4" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.206764 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-jz4l5" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.245163 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-lkjq4"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.251750 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnwz8\" (UniqueName: \"kubernetes.io/projected/d3c84591-dfcf-48e6-a022-25562660675e-kube-api-access-wnwz8\") pod \"barbican-operator-controller-manager-5cfd84c587-6h2nr\" (UID: \"d3c84591-dfcf-48e6-a022-25562660675e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-6h2nr" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.257149 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-gqlhn"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.257979 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-gqlhn" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.262245 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-g6t7s" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.297684 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7spcb" podUID="97d28b1b-fab2-4a70-88e6-a9d956721966" containerName="registry-server" containerID="cri-o://416bb501bce96a3ce01955abf410ba60e297d73666a5516fad437ec46ca0aa6d" gracePeriod=2 Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.299411 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.300340 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.307211 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-rxw92" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.331730 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-gqlhn"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.354486 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.362223 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7g95\" (UniqueName: \"kubernetes.io/projected/5522a0a6-b385-4bf6-990c-5a07561257b0-kube-api-access-q7g95\") pod 
\"cinder-operator-controller-manager-6d77645966-vxzhk\" (UID: \"5522a0a6-b385-4bf6-990c-5a07561257b0\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-vxzhk" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.362298 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxlxc\" (UniqueName: \"kubernetes.io/projected/a96026f1-4dcb-483a-83da-aecc72e7590c-kube-api-access-vxlxc\") pod \"designate-operator-controller-manager-6cc65c69fc-lkjq4\" (UID: \"a96026f1-4dcb-483a-83da-aecc72e7590c\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-lkjq4" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.362402 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpmwr\" (UniqueName: \"kubernetes.io/projected/fad28507-ca7b-4452-b392-f0b68e1f9d64-kube-api-access-bpmwr\") pod \"glance-operator-controller-manager-7d559dcdbd-gqlhn\" (UID: \"fad28507-ca7b-4452-b392-f0b68e1f9d64\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-gqlhn" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.362448 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnwz8\" (UniqueName: \"kubernetes.io/projected/d3c84591-dfcf-48e6-a022-25562660675e-kube-api-access-wnwz8\") pod \"barbican-operator-controller-manager-5cfd84c587-6h2nr\" (UID: \"d3c84591-dfcf-48e6-a022-25562660675e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-6h2nr" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.365195 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.396087 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qw25q" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.396256 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.426895 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnwz8\" (UniqueName: \"kubernetes.io/projected/d3c84591-dfcf-48e6-a022-25562660675e-kube-api-access-wnwz8\") pod \"barbican-operator-controller-manager-5cfd84c587-6h2nr\" (UID: \"d3c84591-dfcf-48e6-a022-25562660675e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-6h2nr" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.446940 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.447767 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.449170 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.453263 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-gpl9d" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.453486 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.453595 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-kmqjb"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.454550 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-kmqjb" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.463529 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7g95\" (UniqueName: \"kubernetes.io/projected/5522a0a6-b385-4bf6-990c-5a07561257b0-kube-api-access-q7g95\") pod \"cinder-operator-controller-manager-6d77645966-vxzhk\" (UID: \"5522a0a6-b385-4bf6-990c-5a07561257b0\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-vxzhk" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.463588 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxlxc\" (UniqueName: \"kubernetes.io/projected/a96026f1-4dcb-483a-83da-aecc72e7590c-kube-api-access-vxlxc\") pod \"designate-operator-controller-manager-6cc65c69fc-lkjq4\" (UID: \"a96026f1-4dcb-483a-83da-aecc72e7590c\") " 
pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-lkjq4" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.463643 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpmwr\" (UniqueName: \"kubernetes.io/projected/fad28507-ca7b-4452-b392-f0b68e1f9d64-kube-api-access-bpmwr\") pod \"glance-operator-controller-manager-7d559dcdbd-gqlhn\" (UID: \"fad28507-ca7b-4452-b392-f0b68e1f9d64\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-gqlhn" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.463683 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2vf6\" (UniqueName: \"kubernetes.io/projected/127bc03d-748e-4919-97f8-6f66ab3e2a8a-kube-api-access-s2vf6\") pod \"horizon-operator-controller-manager-64dc66d669-lk8d6\" (UID: \"127bc03d-748e-4919-97f8-6f66ab3e2a8a\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.463735 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjq94\" (UniqueName: \"kubernetes.io/projected/4faee52b-73ab-41d7-a319-33eb67e1aa30-kube-api-access-vjq94\") pod \"heat-operator-controller-manager-66dd9d474d-kw6px\" (UID: \"4faee52b-73ab-41d7-a319-33eb67e1aa30\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.471855 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-9v59d" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.472265 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-6h2nr" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.483523 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.503135 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-kmqjb"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.513906 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.514913 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.515801 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpmwr\" (UniqueName: \"kubernetes.io/projected/fad28507-ca7b-4452-b392-f0b68e1f9d64-kube-api-access-bpmwr\") pod \"glance-operator-controller-manager-7d559dcdbd-gqlhn\" (UID: \"fad28507-ca7b-4452-b392-f0b68e1f9d64\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-gqlhn" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.516583 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7g95\" (UniqueName: \"kubernetes.io/projected/5522a0a6-b385-4bf6-990c-5a07561257b0-kube-api-access-q7g95\") pod \"cinder-operator-controller-manager-6d77645966-vxzhk\" (UID: \"5522a0a6-b385-4bf6-990c-5a07561257b0\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-vxzhk" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.528343 4580 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-79lhc" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.529394 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxlxc\" (UniqueName: \"kubernetes.io/projected/a96026f1-4dcb-483a-83da-aecc72e7590c-kube-api-access-vxlxc\") pod \"designate-operator-controller-manager-6cc65c69fc-lkjq4\" (UID: \"a96026f1-4dcb-483a-83da-aecc72e7590c\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-lkjq4" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.536301 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.544092 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.545063 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.552227 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-s8trb" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.569061 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs75w\" (UniqueName: \"kubernetes.io/projected/b56378b1-33b0-4032-a383-49163ca1811d-kube-api-access-qs75w\") pod \"infra-operator-controller-manager-5595c7d6ff-nd42d\" (UID: \"b56378b1-33b0-4032-a383-49163ca1811d\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.569360 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2vf6\" (UniqueName: \"kubernetes.io/projected/127bc03d-748e-4919-97f8-6f66ab3e2a8a-kube-api-access-s2vf6\") pod \"horizon-operator-controller-manager-64dc66d669-lk8d6\" (UID: \"127bc03d-748e-4919-97f8-6f66ab3e2a8a\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.569487 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-nd42d\" (UID: \"b56378b1-33b0-4032-a383-49163ca1811d\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.569597 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjq94\" (UniqueName: \"kubernetes.io/projected/4faee52b-73ab-41d7-a319-33eb67e1aa30-kube-api-access-vjq94\") pod 
\"heat-operator-controller-manager-66dd9d474d-kw6px\" (UID: \"4faee52b-73ab-41d7-a319-33eb67e1aa30\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.569691 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7gkq\" (UniqueName: \"kubernetes.io/projected/21248f61-caf9-4660-8299-3b10368fa8ad-kube-api-access-b7gkq\") pod \"ironic-operator-controller-manager-6b77b7676d-kmqjb\" (UID: \"21248f61-caf9-4660-8299-3b10368fa8ad\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-kmqjb" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.557900 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.583547 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-gqlhn" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.613849 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.614947 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.617517 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-5mkg5" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.618757 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjq94\" (UniqueName: \"kubernetes.io/projected/4faee52b-73ab-41d7-a319-33eb67e1aa30-kube-api-access-vjq94\") pod \"heat-operator-controller-manager-66dd9d474d-kw6px\" (UID: \"4faee52b-73ab-41d7-a319-33eb67e1aa30\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.637065 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.643520 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2vf6\" (UniqueName: \"kubernetes.io/projected/127bc03d-748e-4919-97f8-6f66ab3e2a8a-kube-api-access-s2vf6\") pod \"horizon-operator-controller-manager-64dc66d669-lk8d6\" (UID: \"127bc03d-748e-4919-97f8-6f66ab3e2a8a\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.670432 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs75w\" (UniqueName: \"kubernetes.io/projected/b56378b1-33b0-4032-a383-49163ca1811d-kube-api-access-qs75w\") pod \"infra-operator-controller-manager-5595c7d6ff-nd42d\" (UID: \"b56378b1-33b0-4032-a383-49163ca1811d\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.670481 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcgfw\" (UniqueName: \"kubernetes.io/projected/02dbd40b-11b9-4fca-9617-72b7be489626-kube-api-access-tcgfw\") pod \"keystone-operator-controller-manager-784c64596-vdvhl\" (UID: \"02dbd40b-11b9-4fca-9617-72b7be489626\") " pod="openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.670532 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-nd42d\" (UID: \"b56378b1-33b0-4032-a383-49163ca1811d\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.670555 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7gkq\" (UniqueName: \"kubernetes.io/projected/21248f61-caf9-4660-8299-3b10368fa8ad-kube-api-access-b7gkq\") pod \"ironic-operator-controller-manager-6b77b7676d-kmqjb\" (UID: \"21248f61-caf9-4660-8299-3b10368fa8ad\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-kmqjb" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.670586 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fldrb\" (UniqueName: \"kubernetes.io/projected/d45ead43-2f4d-46fc-857f-7e6dbb3e08f6-kube-api-access-fldrb\") pod \"manila-operator-controller-manager-fbf7bbb96-mn29l\" (UID: \"d45ead43-2f4d-46fc-857f-7e6dbb3e08f6\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l" Mar 21 05:08:58 crc kubenswrapper[4580]: E0321 05:08:58.672020 4580 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 05:08:58 crc kubenswrapper[4580]: 
E0321 05:08:58.672108 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert podName:b56378b1-33b0-4032-a383-49163ca1811d nodeName:}" failed. No retries permitted until 2026-03-21 05:08:59.172062667 +0000 UTC m=+1044.254646335 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert") pod "infra-operator-controller-manager-5595c7d6ff-nd42d" (UID: "b56378b1-33b0-4032-a383-49163ca1811d") : secret "infra-operator-webhook-server-cert" not found Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.677492 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.678380 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.682272 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7lkkm" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.686689 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.699312 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.700360 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.703193 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-j8kpc" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.712523 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs75w\" (UniqueName: \"kubernetes.io/projected/b56378b1-33b0-4032-a383-49163ca1811d-kube-api-access-qs75w\") pod \"infra-operator-controller-manager-5595c7d6ff-nd42d\" (UID: \"b56378b1-33b0-4032-a383-49163ca1811d\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.748727 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.751874 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.755890 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.757085 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7gkq\" (UniqueName: \"kubernetes.io/projected/21248f61-caf9-4660-8299-3b10368fa8ad-kube-api-access-b7gkq\") pod \"ironic-operator-controller-manager-6b77b7676d-kmqjb\" (UID: \"21248f61-caf9-4660-8299-3b10368fa8ad\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-kmqjb" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.761408 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-kx5dr" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.771188 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgctk\" (UniqueName: \"kubernetes.io/projected/2535b22b-0bed-4ffd-9430-ca9fb3230c62-kube-api-access-lgctk\") pod \"nova-operator-controller-manager-bc5c78db9-d5jll\" (UID: \"2535b22b-0bed-4ffd-9430-ca9fb3230c62\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.771242 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcgfw\" (UniqueName: \"kubernetes.io/projected/02dbd40b-11b9-4fca-9617-72b7be489626-kube-api-access-tcgfw\") pod \"keystone-operator-controller-manager-784c64596-vdvhl\" (UID: \"02dbd40b-11b9-4fca-9617-72b7be489626\") " pod="openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.771281 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rptqv\" (UniqueName: \"kubernetes.io/projected/6f7dea10-53e8-4c25-87bc-ffd154d4cb7d-kube-api-access-rptqv\") pod 
\"neutron-operator-controller-manager-6744dd545c-sdpcs\" (UID: \"6f7dea10-53e8-4c25-87bc-ffd154d4cb7d\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.771324 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgfsz\" (UniqueName: \"kubernetes.io/projected/a216d106-9a69-4143-8766-4e505f2b5a8f-kube-api-access-qgfsz\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-7dkgm\" (UID: \"a216d106-9a69-4143-8766-4e505f2b5a8f\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.771350 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fldrb\" (UniqueName: \"kubernetes.io/projected/d45ead43-2f4d-46fc-857f-7e6dbb3e08f6-kube-api-access-fldrb\") pod \"manila-operator-controller-manager-fbf7bbb96-mn29l\" (UID: \"d45ead43-2f4d-46fc-857f-7e6dbb3e08f6\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.773840 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.793657 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-vxzhk" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.799964 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.800772 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.822354 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.823839 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-vq8s7" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.829017 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-lkjq4" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.830182 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.834475 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-kg5cb"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.835310 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-kg5cb" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.858566 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-fvt5d" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.858760 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcgfw\" (UniqueName: \"kubernetes.io/projected/02dbd40b-11b9-4fca-9617-72b7be489626-kube-api-access-tcgfw\") pod \"keystone-operator-controller-manager-784c64596-vdvhl\" (UID: \"02dbd40b-11b9-4fca-9617-72b7be489626\") " pod="openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.866207 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.867173 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.874369 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgctk\" (UniqueName: \"kubernetes.io/projected/2535b22b-0bed-4ffd-9430-ca9fb3230c62-kube-api-access-lgctk\") pod \"nova-operator-controller-manager-bc5c78db9-d5jll\" (UID: \"2535b22b-0bed-4ffd-9430-ca9fb3230c62\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.874415 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v\" (UID: \"2e366c15-abc4-4e05-9054-cd7828e00059\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.874454 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rptqv\" (UniqueName: \"kubernetes.io/projected/6f7dea10-53e8-4c25-87bc-ffd154d4cb7d-kube-api-access-rptqv\") pod \"neutron-operator-controller-manager-6744dd545c-sdpcs\" (UID: \"6f7dea10-53e8-4c25-87bc-ffd154d4cb7d\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.874501 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqj75\" (UniqueName: \"kubernetes.io/projected/67d1d125-57c7-4c30-a51a-24db28fb4818-kube-api-access-xqj75\") pod \"octavia-operator-controller-manager-56f74467c6-2h95r\" (UID: \"67d1d125-57c7-4c30-a51a-24db28fb4818\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r" Mar 21 05:08:58 crc 
kubenswrapper[4580]: I0321 05:08:58.874542 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgfsz\" (UniqueName: \"kubernetes.io/projected/a216d106-9a69-4143-8766-4e505f2b5a8f-kube-api-access-qgfsz\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-7dkgm\" (UID: \"a216d106-9a69-4143-8766-4e505f2b5a8f\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.874566 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsnbm\" (UniqueName: \"kubernetes.io/projected/2e366c15-abc4-4e05-9054-cd7828e00059-kube-api-access-gsnbm\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v\" (UID: \"2e366c15-abc4-4e05-9054-cd7828e00059\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.878619 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fldrb\" (UniqueName: \"kubernetes.io/projected/d45ead43-2f4d-46fc-857f-7e6dbb3e08f6-kube-api-access-fldrb\") pod \"manila-operator-controller-manager-fbf7bbb96-mn29l\" (UID: \"d45ead43-2f4d-46fc-857f-7e6dbb3e08f6\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.882118 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.892518 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-k65gr" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.929680 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.937079 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgctk\" (UniqueName: \"kubernetes.io/projected/2535b22b-0bed-4ffd-9430-ca9fb3230c62-kube-api-access-lgctk\") pod \"nova-operator-controller-manager-bc5c78db9-d5jll\" (UID: \"2535b22b-0bed-4ffd-9430-ca9fb3230c62\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.938573 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgfsz\" (UniqueName: \"kubernetes.io/projected/a216d106-9a69-4143-8766-4e505f2b5a8f-kube-api-access-qgfsz\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-7dkgm\" (UID: \"a216d106-9a69-4143-8766-4e505f2b5a8f\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.948174 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rptqv\" (UniqueName: \"kubernetes.io/projected/6f7dea10-53e8-4c25-87bc-ffd154d4cb7d-kube-api-access-rptqv\") pod \"neutron-operator-controller-manager-6744dd545c-sdpcs\" (UID: \"6f7dea10-53e8-4c25-87bc-ffd154d4cb7d\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.954125 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-kmqjb" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.974125 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.977182 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v\" (UID: \"2e366c15-abc4-4e05-9054-cd7828e00059\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.977238 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5rzs\" (UniqueName: \"kubernetes.io/projected/0e31c4f0-9b9d-4c10-84de-d15718775f9a-kube-api-access-p5rzs\") pod \"placement-operator-controller-manager-659fb58c6b-ln984\" (UID: \"0e31c4f0-9b9d-4c10-84de-d15718775f9a\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.977265 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqf58\" (UniqueName: \"kubernetes.io/projected/bd6a540f-0a1b-4098-8573-b9049d52f49b-kube-api-access-tqf58\") pod \"ovn-operator-controller-manager-846c4cdcb7-kg5cb\" (UID: \"bd6a540f-0a1b-4098-8573-b9049d52f49b\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-kg5cb" Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.977288 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqj75\" (UniqueName: \"kubernetes.io/projected/67d1d125-57c7-4c30-a51a-24db28fb4818-kube-api-access-xqj75\") pod \"octavia-operator-controller-manager-56f74467c6-2h95r\" (UID: \"67d1d125-57c7-4c30-a51a-24db28fb4818\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r" Mar 21 05:08:58 crc 
kubenswrapper[4580]: I0321 05:08:58.977341 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsnbm\" (UniqueName: \"kubernetes.io/projected/2e366c15-abc4-4e05-9054-cd7828e00059-kube-api-access-gsnbm\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v\" (UID: \"2e366c15-abc4-4e05-9054-cd7828e00059\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:08:58 crc kubenswrapper[4580]: E0321 05:08:58.977737 4580 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.983828 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984"] Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.984092 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l" Mar 21 05:08:58 crc kubenswrapper[4580]: E0321 05:08:58.988846 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert podName:2e366c15-abc4-4e05-9054-cd7828e00059 nodeName:}" failed. No retries permitted until 2026-03-21 05:08:59.477772822 +0000 UTC m=+1044.560356450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" (UID: "2e366c15-abc4-4e05-9054-cd7828e00059") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:08:58 crc kubenswrapper[4580]: I0321 05:08:58.996248 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.018365 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsnbm\" (UniqueName: \"kubernetes.io/projected/2e366c15-abc4-4e05-9054-cd7828e00059-kube-api-access-gsnbm\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v\" (UID: \"2e366c15-abc4-4e05-9054-cd7828e00059\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.018374 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqj75\" (UniqueName: \"kubernetes.io/projected/67d1d125-57c7-4c30-a51a-24db28fb4818-kube-api-access-xqj75\") pod \"octavia-operator-controller-manager-56f74467c6-2h95r\" (UID: \"67d1d125-57c7-4c30-a51a-24db28fb4818\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.018638 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.026819 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-kg5cb"] Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.061747 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.088159 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5rzs\" (UniqueName: \"kubernetes.io/projected/0e31c4f0-9b9d-4c10-84de-d15718775f9a-kube-api-access-p5rzs\") pod \"placement-operator-controller-manager-659fb58c6b-ln984\" (UID: \"0e31c4f0-9b9d-4c10-84de-d15718775f9a\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.088223 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqf58\" (UniqueName: \"kubernetes.io/projected/bd6a540f-0a1b-4098-8573-b9049d52f49b-kube-api-access-tqf58\") pod \"ovn-operator-controller-manager-846c4cdcb7-kg5cb\" (UID: \"bd6a540f-0a1b-4098-8573-b9049d52f49b\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-kg5cb" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.112929 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.161139 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqf58\" (UniqueName: \"kubernetes.io/projected/bd6a540f-0a1b-4098-8573-b9049d52f49b-kube-api-access-tqf58\") pod \"ovn-operator-controller-manager-846c4cdcb7-kg5cb\" (UID: \"bd6a540f-0a1b-4098-8573-b9049d52f49b\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-kg5cb" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.174875 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5rzs\" (UniqueName: \"kubernetes.io/projected/0e31c4f0-9b9d-4c10-84de-d15718775f9a-kube-api-access-p5rzs\") pod \"placement-operator-controller-manager-659fb58c6b-ln984\" (UID: \"0e31c4f0-9b9d-4c10-84de-d15718775f9a\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.197369 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-kg5cb" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.198169 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-nd42d\" (UID: \"b56378b1-33b0-4032-a383-49163ca1811d\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:08:59 crc kubenswrapper[4580]: E0321 05:08:59.198480 4580 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 05:08:59 crc kubenswrapper[4580]: E0321 05:08:59.198546 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert podName:b56378b1-33b0-4032-a383-49163ca1811d nodeName:}" failed. No retries permitted until 2026-03-21 05:09:00.19852051 +0000 UTC m=+1045.281104138 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert") pod "infra-operator-controller-manager-5595c7d6ff-nd42d" (UID: "b56378b1-33b0-4032-a383-49163ca1811d") : secret "infra-operator-webhook-server-cert" not found Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.199573 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5"] Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.218723 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.223873 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-k6znr" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.247962 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx"] Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.249163 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.249871 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.252592 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-n8nns" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.262818 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v"] Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.263687 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.281389 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-9jb5n" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.300009 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp56x\" (UniqueName: \"kubernetes.io/projected/5992ccfd-4585-49b2-84ff-3f1fe6812a82-kube-api-access-zp56x\") pod \"test-operator-controller-manager-8467ccb4c8-rqw4v\" (UID: \"5992ccfd-4585-49b2-84ff-3f1fe6812a82\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.300067 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ghx2\" (UniqueName: \"kubernetes.io/projected/2a0c721d-68cd-46de-8292-6bd8373e1106-kube-api-access-6ghx2\") pod \"telemetry-operator-controller-manager-6d84559f47-g52cx\" (UID: \"2a0c721d-68cd-46de-8292-6bd8373e1106\") " pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.300086 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxzwh\" (UniqueName: \"kubernetes.io/projected/8b025526-696f-4d7d-82ed-df03050fa1fd-kube-api-access-cxzwh\") pod \"swift-operator-controller-manager-867f54bc44-78zl5\" (UID: \"8b025526-696f-4d7d-82ed-df03050fa1fd\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.304418 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5"] Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.321543 
4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx"] Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.345190 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v"] Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.401276 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-q8jxn"] Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.402189 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-q8jxn" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.403744 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp56x\" (UniqueName: \"kubernetes.io/projected/5992ccfd-4585-49b2-84ff-3f1fe6812a82-kube-api-access-zp56x\") pod \"test-operator-controller-manager-8467ccb4c8-rqw4v\" (UID: \"5992ccfd-4585-49b2-84ff-3f1fe6812a82\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.406943 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ghx2\" (UniqueName: \"kubernetes.io/projected/2a0c721d-68cd-46de-8292-6bd8373e1106-kube-api-access-6ghx2\") pod \"telemetry-operator-controller-manager-6d84559f47-g52cx\" (UID: \"2a0c721d-68cd-46de-8292-6bd8373e1106\") " pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.406985 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxzwh\" (UniqueName: \"kubernetes.io/projected/8b025526-696f-4d7d-82ed-df03050fa1fd-kube-api-access-cxzwh\") pod \"swift-operator-controller-manager-867f54bc44-78zl5\" (UID: 
\"8b025526-696f-4d7d-82ed-df03050fa1fd\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.408181 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-mbg8p" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.417397 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-q8jxn"] Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.444324 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w"] Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.444888 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ghx2\" (UniqueName: \"kubernetes.io/projected/2a0c721d-68cd-46de-8292-6bd8373e1106-kube-api-access-6ghx2\") pod \"telemetry-operator-controller-manager-6d84559f47-g52cx\" (UID: \"2a0c721d-68cd-46de-8292-6bd8373e1106\") " pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.445189 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.446614 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w"] Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.451529 4580 generic.go:334] "Generic (PLEG): container finished" podID="97d28b1b-fab2-4a70-88e6-a9d956721966" containerID="416bb501bce96a3ce01955abf410ba60e297d73666a5516fad437ec46ca0aa6d" exitCode=0 Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.451570 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7spcb" event={"ID":"97d28b1b-fab2-4a70-88e6-a9d956721966","Type":"ContainerDied","Data":"416bb501bce96a3ce01955abf410ba60e297d73666a5516fad437ec46ca0aa6d"} Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.451931 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxzwh\" (UniqueName: \"kubernetes.io/projected/8b025526-696f-4d7d-82ed-df03050fa1fd-kube-api-access-cxzwh\") pod \"swift-operator-controller-manager-867f54bc44-78zl5\" (UID: \"8b025526-696f-4d7d-82ed-df03050fa1fd\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.452199 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-kzhzm" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.456381 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.456602 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.511917 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.511969 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.512009 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v\" (UID: \"2e366c15-abc4-4e05-9054-cd7828e00059\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.512031 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkbb9\" (UniqueName: \"kubernetes.io/projected/6080f6a7-a68f-447a-bedd-182cd69337b5-kube-api-access-bkbb9\") pod \"watcher-operator-controller-manager-74d6f7b5c-q8jxn\" (UID: \"6080f6a7-a68f-447a-bedd-182cd69337b5\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-q8jxn" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.512048 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsjhg\" (UniqueName: 
\"kubernetes.io/projected/19a22b87-c6f3-4020-aa11-2a940041f49c-kube-api-access-rsjhg\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:08:59 crc kubenswrapper[4580]: E0321 05:08:59.512170 4580 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:08:59 crc kubenswrapper[4580]: E0321 05:08:59.512208 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert podName:2e366c15-abc4-4e05-9054-cd7828e00059 nodeName:}" failed. No retries permitted until 2026-03-21 05:09:00.512195156 +0000 UTC m=+1045.594778784 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" (UID: "2e366c15-abc4-4e05-9054-cd7828e00059") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.513671 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx"] Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.514545 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.518663 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp56x\" (UniqueName: \"kubernetes.io/projected/5992ccfd-4585-49b2-84ff-3f1fe6812a82-kube-api-access-zp56x\") pod \"test-operator-controller-manager-8467ccb4c8-rqw4v\" (UID: \"5992ccfd-4585-49b2-84ff-3f1fe6812a82\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.520043 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-jrlj9" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.531611 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx"] Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.600254 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.613834 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkbb9\" (UniqueName: \"kubernetes.io/projected/6080f6a7-a68f-447a-bedd-182cd69337b5-kube-api-access-bkbb9\") pod \"watcher-operator-controller-manager-74d6f7b5c-q8jxn\" (UID: \"6080f6a7-a68f-447a-bedd-182cd69337b5\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-q8jxn" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.613871 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsjhg\" (UniqueName: \"kubernetes.io/projected/19a22b87-c6f3-4020-aa11-2a940041f49c-kube-api-access-rsjhg\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.613950 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqmkk\" (UniqueName: \"kubernetes.io/projected/92286fdb-b69e-4028-8a93-3517469a731c-kube-api-access-fqmkk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5mdkx\" (UID: \"92286fdb-b69e-4028-8a93-3517469a731c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.613974 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 
05:08:59.614639 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:08:59 crc kubenswrapper[4580]: E0321 05:08:59.614388 4580 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 05:08:59 crc kubenswrapper[4580]: E0321 05:08:59.614833 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs podName:19a22b87-c6f3-4020-aa11-2a940041f49c nodeName:}" failed. No retries permitted until 2026-03-21 05:09:00.11481602 +0000 UTC m=+1045.197399648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs") pod "openstack-operator-controller-manager-684bbdfff8-7nr7w" (UID: "19a22b87-c6f3-4020-aa11-2a940041f49c") : secret "webhook-server-cert" not found Mar 21 05:08:59 crc kubenswrapper[4580]: E0321 05:08:59.614768 4580 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 05:08:59 crc kubenswrapper[4580]: E0321 05:08:59.615058 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs podName:19a22b87-c6f3-4020-aa11-2a940041f49c nodeName:}" failed. No retries permitted until 2026-03-21 05:09:00.115051136 +0000 UTC m=+1045.197634754 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs") pod "openstack-operator-controller-manager-684bbdfff8-7nr7w" (UID: "19a22b87-c6f3-4020-aa11-2a940041f49c") : secret "metrics-server-cert" not found Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.620394 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.636608 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.638519 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsjhg\" (UniqueName: \"kubernetes.io/projected/19a22b87-c6f3-4020-aa11-2a940041f49c-kube-api-access-rsjhg\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.652272 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkbb9\" (UniqueName: \"kubernetes.io/projected/6080f6a7-a68f-447a-bedd-182cd69337b5-kube-api-access-bkbb9\") pod \"watcher-operator-controller-manager-74d6f7b5c-q8jxn\" (UID: \"6080f6a7-a68f-447a-bedd-182cd69337b5\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-q8jxn" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.676737 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.722936 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l29m6\" (UniqueName: \"kubernetes.io/projected/97d28b1b-fab2-4a70-88e6-a9d956721966-kube-api-access-l29m6\") pod \"97d28b1b-fab2-4a70-88e6-a9d956721966\" (UID: \"97d28b1b-fab2-4a70-88e6-a9d956721966\") " Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.723011 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97d28b1b-fab2-4a70-88e6-a9d956721966-utilities\") pod \"97d28b1b-fab2-4a70-88e6-a9d956721966\" (UID: \"97d28b1b-fab2-4a70-88e6-a9d956721966\") " Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.723075 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97d28b1b-fab2-4a70-88e6-a9d956721966-catalog-content\") pod \"97d28b1b-fab2-4a70-88e6-a9d956721966\" (UID: \"97d28b1b-fab2-4a70-88e6-a9d956721966\") " Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.727429 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqmkk\" (UniqueName: \"kubernetes.io/projected/92286fdb-b69e-4028-8a93-3517469a731c-kube-api-access-fqmkk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5mdkx\" (UID: \"92286fdb-b69e-4028-8a93-3517469a731c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.746306 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97d28b1b-fab2-4a70-88e6-a9d956721966-utilities" (OuterVolumeSpecName: "utilities") pod "97d28b1b-fab2-4a70-88e6-a9d956721966" (UID: "97d28b1b-fab2-4a70-88e6-a9d956721966"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.746582 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97d28b1b-fab2-4a70-88e6-a9d956721966-kube-api-access-l29m6" (OuterVolumeSpecName: "kube-api-access-l29m6") pod "97d28b1b-fab2-4a70-88e6-a9d956721966" (UID: "97d28b1b-fab2-4a70-88e6-a9d956721966"). InnerVolumeSpecName "kube-api-access-l29m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.764623 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqmkk\" (UniqueName: \"kubernetes.io/projected/92286fdb-b69e-4028-8a93-3517469a731c-kube-api-access-fqmkk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5mdkx\" (UID: \"92286fdb-b69e-4028-8a93-3517469a731c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.791293 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97d28b1b-fab2-4a70-88e6-a9d956721966-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97d28b1b-fab2-4a70-88e6-a9d956721966" (UID: "97d28b1b-fab2-4a70-88e6-a9d956721966"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.829452 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l29m6\" (UniqueName: \"kubernetes.io/projected/97d28b1b-fab2-4a70-88e6-a9d956721966-kube-api-access-l29m6\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.829485 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97d28b1b-fab2-4a70-88e6-a9d956721966-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.829498 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97d28b1b-fab2-4a70-88e6-a9d956721966-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.940884 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-q8jxn" Mar 21 05:08:59 crc kubenswrapper[4580]: I0321 05:08:59.958395 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-6h2nr"] Mar 21 05:09:00 crc kubenswrapper[4580]: W0321 05:09:00.001998 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3c84591_dfcf_48e6_a022_25562660675e.slice/crio-d28a415a704d2f04f199b219eaae3f20d152b7cb2dc6185b0b30d8447595270a WatchSource:0}: Error finding container d28a415a704d2f04f199b219eaae3f20d152b7cb2dc6185b0b30d8447595270a: Status 404 returned error can't find the container with id d28a415a704d2f04f199b219eaae3f20d152b7cb2dc6185b0b30d8447595270a Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.007502 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx" Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.137974 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.138087 4580 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.138457 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs podName:19a22b87-c6f3-4020-aa11-2a940041f49c nodeName:}" failed. No retries permitted until 2026-03-21 05:09:01.138434077 +0000 UTC m=+1046.221017705 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs") pod "openstack-operator-controller-manager-684bbdfff8-7nr7w" (UID: "19a22b87-c6f3-4020-aa11-2a940041f49c") : secret "webhook-server-cert" not found Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.138579 4580 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.138635 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs podName:19a22b87-c6f3-4020-aa11-2a940041f49c nodeName:}" failed. No retries permitted until 2026-03-21 05:09:01.138617492 +0000 UTC m=+1046.221201120 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs") pod "openstack-operator-controller-manager-684bbdfff8-7nr7w" (UID: "19a22b87-c6f3-4020-aa11-2a940041f49c") : secret "metrics-server-cert" not found Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.138905 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.240647 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-nd42d\" (UID: \"b56378b1-33b0-4032-a383-49163ca1811d\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.240884 4580 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.241051 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert podName:b56378b1-33b0-4032-a383-49163ca1811d nodeName:}" failed. No retries permitted until 2026-03-21 05:09:02.241023831 +0000 UTC m=+1047.323607459 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert") pod "infra-operator-controller-manager-5595c7d6ff-nd42d" (UID: "b56378b1-33b0-4032-a383-49163ca1811d") : secret "infra-operator-webhook-server-cert" not found Mar 21 05:09:00 crc kubenswrapper[4580]: W0321 05:09:00.331896 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4faee52b_73ab_41d7_a319_33eb67e1aa30.slice/crio-75c4281ac8b48f475c0e5951ccd23a01daf182dc3f11b2fda64e22f3fa5ea2fd WatchSource:0}: Error finding container 75c4281ac8b48f475c0e5951ccd23a01daf182dc3f11b2fda64e22f3fa5ea2fd: Status 404 returned error can't find the container with id 75c4281ac8b48f475c0e5951ccd23a01daf182dc3f11b2fda64e22f3fa5ea2fd Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.334493 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.364359 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.378080 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-kmqjb"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.426697 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-lkjq4"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.519613 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.537516 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-lkjq4" event={"ID":"a96026f1-4dcb-483a-83da-aecc72e7590c","Type":"ContainerStarted","Data":"b439f09d79f1a90e87fbd2109edd75324c5b44eef5c497912ae66feac5f2cb3d"} Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.539815 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px" event={"ID":"4faee52b-73ab-41d7-a319-33eb67e1aa30","Type":"ContainerStarted","Data":"75c4281ac8b48f475c0e5951ccd23a01daf182dc3f11b2fda64e22f3fa5ea2fd"} Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.545425 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7spcb" event={"ID":"97d28b1b-fab2-4a70-88e6-a9d956721966","Type":"ContainerDied","Data":"52c5fb9d0817669c329b20f525b6dc26279d729a42d903f858b60575a75aae0d"} Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.545472 4580 scope.go:117] "RemoveContainer" containerID="416bb501bce96a3ce01955abf410ba60e297d73666a5516fad437ec46ca0aa6d" Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.545653 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7spcb" Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.551945 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-kmqjb" event={"ID":"21248f61-caf9-4660-8299-3b10368fa8ad","Type":"ContainerStarted","Data":"ed3a1e28bb2b898e98bea23666c9bfff2e6c040328fee0e123b9e39f7f90e275"} Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.560115 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-gqlhn"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.560351 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-6h2nr" event={"ID":"d3c84591-dfcf-48e6-a022-25562660675e","Type":"ContainerStarted","Data":"d28a415a704d2f04f199b219eaae3f20d152b7cb2dc6185b0b30d8447595270a"} Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.571905 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-vxzhk"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.578070 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l" event={"ID":"d45ead43-2f4d-46fc-857f-7e6dbb3e08f6","Type":"ContainerStarted","Data":"ce184912456bcdc74a4b000d11888149b222f18be4d800a2be259a2b03d816a8"} Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.581983 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v\" (UID: \"2e366c15-abc4-4e05-9054-cd7828e00059\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:09:00 crc 
kubenswrapper[4580]: E0321 05:09:00.582254 4580 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.582434 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert podName:2e366c15-abc4-4e05-9054-cd7828e00059 nodeName:}" failed. No retries permitted until 2026-03-21 05:09:02.582416569 +0000 UTC m=+1047.665000197 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" (UID: "2e366c15-abc4-4e05-9054-cd7828e00059") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.585013 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.593287 4580 scope.go:117] "RemoveContainer" containerID="3e8d95e65a0bd4395fcf7b1cc1928118888f24222f7d9f3105f1e4ac634ac3c8" Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.607124 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-kg5cb"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.621551 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.627372 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7spcb"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.651216 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-7spcb"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.670577 4580 scope.go:117] "RemoveContainer" containerID="402a606a72ee541d3258ae8555d8ca72f394bfd8a48e9d215a9bbb933713c39b" Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.747646 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.758520 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.771350 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-q8jxn"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.786111 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.798290 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5"] Mar 21 05:09:00 crc kubenswrapper[4580]: W0321 05:09:00.801529 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6080f6a7_a68f_447a_bedd_182cd69337b5.slice/crio-06c5bb32a5fcefe1eb67471597c85e01e0045e74072c5408388ebeec97bc7bd0 WatchSource:0}: Error finding container 06c5bb32a5fcefe1eb67471597c85e01e0045e74072c5408388ebeec97bc7bd0: Status 404 returned error can't find the container with id 06c5bb32a5fcefe1eb67471597c85e01e0045e74072c5408388ebeec97bc7bd0 Mar 21 05:09:00 crc kubenswrapper[4580]: W0321 05:09:00.804006 4580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92286fdb_b69e_4028_8a93_3517469a731c.slice/crio-81ab8ee9adabee452f0b5da82832a69b36af753586abcf50ac1c1a2574637ca3 WatchSource:0}: Error finding container 81ab8ee9adabee452f0b5da82832a69b36af753586abcf50ac1c1a2574637ca3: Status 404 returned error can't find the container with id 81ab8ee9adabee452f0b5da82832a69b36af753586abcf50ac1c1a2574637ca3 Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.804976 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx"] Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.806576 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fqmkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-5mdkx_openstack-operators(92286fdb-b69e-4028-8a93-3517469a731c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.809355 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx" podUID="92286fdb-b69e-4028-8a93-3517469a731c" Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.811644 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll"] Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.818367 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs"] Mar 21 05:09:00 crc kubenswrapper[4580]: W0321 05:09:00.820248 4580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2535b22b_0bed_4ffd_9430_ca9fb3230c62.slice/crio-dd99bb35ab767bc7ca7cae72885acf6a4f11b4d61016f636d9290b4b08f3cef2 WatchSource:0}: Error finding container dd99bb35ab767bc7ca7cae72885acf6a4f11b4d61016f636d9290b4b08f3cef2: Status 404 returned error can't find the container with id dd99bb35ab767bc7ca7cae72885acf6a4f11b4d61016f636d9290b4b08f3cef2 Mar 21 05:09:00 crc kubenswrapper[4580]: W0321 05:09:00.821708 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a0c721d_68cd_46de_8292_6bd8373e1106.slice/crio-cce717d3751f9f8323204a4915d79b5ad5b4d7859178823572363f0f5d7e1a14 WatchSource:0}: Error finding container cce717d3751f9f8323204a4915d79b5ad5b4d7859178823572363f0f5d7e1a14: Status 404 returned error can't find the container with id cce717d3751f9f8323204a4915d79b5ad5b4d7859178823572363f0f5d7e1a14 Mar 21 05:09:00 crc kubenswrapper[4580]: I0321 05:09:00.822356 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v"] Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.823464 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: 
{{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lgctk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-bc5c78db9-d5jll_openstack-operators(2535b22b-0bed-4ffd-9430-ca9fb3230c62): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.824716 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" 
podUID="2535b22b-0bed-4ffd-9430-ca9fb3230c62" Mar 21 05:09:00 crc kubenswrapper[4580]: W0321 05:09:00.827625 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e31c4f0_9b9d_4c10_84de_d15718775f9a.slice/crio-00c1468854cbeeb08fd56c2dd2bb78e09c590e233f2fea205d21ee28d2441680 WatchSource:0}: Error finding container 00c1468854cbeeb08fd56c2dd2bb78e09c590e233f2fea205d21ee28d2441680: Status 404 returned error can't find the container with id 00c1468854cbeeb08fd56c2dd2bb78e09c590e233f2fea205d21ee28d2441680 Mar 21 05:09:00 crc kubenswrapper[4580]: W0321 05:09:00.828746 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f7dea10_53e8_4c25_87bc_ffd154d4cb7d.slice/crio-a7c41ac51051ac6ee9ee0f9c296ecbe7c6fde1ea3bd6c250b4229ca25d0e495a WatchSource:0}: Error finding container a7c41ac51051ac6ee9ee0f9c296ecbe7c6fde1ea3bd6c250b4229ca25d0e495a: Status 404 returned error can't find the container with id a7c41ac51051ac6ee9ee0f9c296ecbe7c6fde1ea3bd6c250b4229ca25d0e495a Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.831666 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p5rzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-659fb58c6b-ln984_openstack-operators(0e31c4f0-9b9d-4c10-84de-d15718775f9a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.832057 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:88f2db101f619563231cfa13f4488596637731f0ebe33c661d4a5e48a86dd3e8,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6ghx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6d84559f47-g52cx_openstack-operators(2a0c721d-68cd-46de-8292-6bd8373e1106): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.832321 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rptqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6744dd545c-sdpcs_openstack-operators(6f7dea10-53e8-4c25-87bc-ffd154d4cb7d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.833462 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs" podUID="6f7dea10-53e8-4c25-87bc-ffd154d4cb7d" Mar 21 05:09:00 crc 
kubenswrapper[4580]: E0321 05:09:00.833477 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984" podUID="0e31c4f0-9b9d-4c10-84de-d15718775f9a" Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.833515 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx" podUID="2a0c721d-68cd-46de-8292-6bd8373e1106" Mar 21 05:09:00 crc kubenswrapper[4580]: W0321 05:09:00.838934 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5992ccfd_4585_49b2_84ff_3f1fe6812a82.slice/crio-503f114ba5065c88b72a58d0b0df10d64c257bec5f4e826b99c55c392ffe35cb WatchSource:0}: Error finding container 503f114ba5065c88b72a58d0b0df10d64c257bec5f4e826b99c55c392ffe35cb: Status 404 returned error can't find the container with id 503f114ba5065c88b72a58d0b0df10d64c257bec5f4e826b99c55c392ffe35cb Mar 21 05:09:00 crc kubenswrapper[4580]: W0321 05:09:00.841648 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b025526_696f_4d7d_82ed_df03050fa1fd.slice/crio-c6613ed225039dd227ffae3cd70b0fde032c10143cc4bd7d95c292dc11b98c8f WatchSource:0}: Error finding container c6613ed225039dd227ffae3cd70b0fde032c10143cc4bd7d95c292dc11b98c8f: Status 404 returned error can't find the container with id c6613ed225039dd227ffae3cd70b0fde032c10143cc4bd7d95c292dc11b98c8f Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.844378 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zp56x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-8467ccb4c8-rqw4v_openstack-operators(5992ccfd-4585-49b2-84ff-3f1fe6812a82): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.845805 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v" podUID="5992ccfd-4585-49b2-84ff-3f1fe6812a82" Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.849380 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:0e0d48e3ca53577e20c81a87f0be6b3254c0b8418e3b446b68c8b5849af7213e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cxzwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-867f54bc44-78zl5_openstack-operators(8b025526-696f-4d7d-82ed-df03050fa1fd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 05:09:00 crc kubenswrapper[4580]: E0321 05:09:00.851124 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5" podUID="8b025526-696f-4d7d-82ed-df03050fa1fd" Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.191557 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.191995 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:01 crc kubenswrapper[4580]: E0321 05:09:01.191745 4580 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 05:09:01 crc kubenswrapper[4580]: E0321 05:09:01.192137 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs podName:19a22b87-c6f3-4020-aa11-2a940041f49c nodeName:}" failed. No retries permitted until 2026-03-21 05:09:03.192097003 +0000 UTC m=+1048.274680631 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs") pod "openstack-operator-controller-manager-684bbdfff8-7nr7w" (UID: "19a22b87-c6f3-4020-aa11-2a940041f49c") : secret "webhook-server-cert" not found Mar 21 05:09:01 crc kubenswrapper[4580]: E0321 05:09:01.192208 4580 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 05:09:01 crc kubenswrapper[4580]: E0321 05:09:01.192280 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs podName:19a22b87-c6f3-4020-aa11-2a940041f49c nodeName:}" failed. No retries permitted until 2026-03-21 05:09:03.192258437 +0000 UTC m=+1048.274842125 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs") pod "openstack-operator-controller-manager-684bbdfff8-7nr7w" (UID: "19a22b87-c6f3-4020-aa11-2a940041f49c") : secret "metrics-server-cert" not found Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.596131 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-q8jxn" event={"ID":"6080f6a7-a68f-447a-bedd-182cd69337b5","Type":"ContainerStarted","Data":"06c5bb32a5fcefe1eb67471597c85e01e0045e74072c5408388ebeec97bc7bd0"} Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.602653 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v" event={"ID":"5992ccfd-4585-49b2-84ff-3f1fe6812a82","Type":"ContainerStarted","Data":"503f114ba5065c88b72a58d0b0df10d64c257bec5f4e826b99c55c392ffe35cb"} Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.605424 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5" event={"ID":"8b025526-696f-4d7d-82ed-df03050fa1fd","Type":"ContainerStarted","Data":"c6613ed225039dd227ffae3cd70b0fde032c10143cc4bd7d95c292dc11b98c8f"} Mar 21 05:09:01 crc kubenswrapper[4580]: E0321 05:09:01.605916 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v" podUID="5992ccfd-4585-49b2-84ff-3f1fe6812a82" Mar 21 05:09:01 crc kubenswrapper[4580]: E0321 05:09:01.607352 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:0e0d48e3ca53577e20c81a87f0be6b3254c0b8418e3b446b68c8b5849af7213e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5" podUID="8b025526-696f-4d7d-82ed-df03050fa1fd" Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.607868 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx" event={"ID":"92286fdb-b69e-4028-8a93-3517469a731c","Type":"ContainerStarted","Data":"81ab8ee9adabee452f0b5da82832a69b36af753586abcf50ac1c1a2574637ca3"} Mar 21 05:09:01 crc kubenswrapper[4580]: E0321 05:09:01.610094 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx" podUID="92286fdb-b69e-4028-8a93-3517469a731c" Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.611552 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx" event={"ID":"2a0c721d-68cd-46de-8292-6bd8373e1106","Type":"ContainerStarted","Data":"cce717d3751f9f8323204a4915d79b5ad5b4d7859178823572363f0f5d7e1a14"} Mar 21 05:09:01 crc kubenswrapper[4580]: E0321 05:09:01.614496 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:88f2db101f619563231cfa13f4488596637731f0ebe33c661d4a5e48a86dd3e8\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx" podUID="2a0c721d-68cd-46de-8292-6bd8373e1106" Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.615498 
4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-vxzhk" event={"ID":"5522a0a6-b385-4bf6-990c-5a07561257b0","Type":"ContainerStarted","Data":"014cad5f50c2ae77fdb036246e379877e9b5d0369212bf48d8415d0d079330b3"} Mar 21 05:09:01 crc kubenswrapper[4580]: E0321 05:09:01.639429 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552\\\"\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" podUID="2535b22b-0bed-4ffd-9430-ca9fb3230c62" Mar 21 05:09:01 crc kubenswrapper[4580]: E0321 05:09:01.640616 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a\\\"\"" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984" podUID="0e31c4f0-9b9d-4c10-84de-d15718775f9a" Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.662595 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97d28b1b-fab2-4a70-88e6-a9d956721966" path="/var/lib/kubelet/pods/97d28b1b-fab2-4a70-88e6-a9d956721966/volumes" Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.665045 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" event={"ID":"2535b22b-0bed-4ffd-9430-ca9fb3230c62","Type":"ContainerStarted","Data":"dd99bb35ab767bc7ca7cae72885acf6a4f11b4d61016f636d9290b4b08f3cef2"} Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.665122 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984" event={"ID":"0e31c4f0-9b9d-4c10-84de-d15718775f9a","Type":"ContainerStarted","Data":"00c1468854cbeeb08fd56c2dd2bb78e09c590e233f2fea205d21ee28d2441680"} Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.665145 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs" event={"ID":"6f7dea10-53e8-4c25-87bc-ffd154d4cb7d","Type":"ContainerStarted","Data":"a7c41ac51051ac6ee9ee0f9c296ecbe7c6fde1ea3bd6c250b4229ca25d0e495a"} Mar 21 05:09:01 crc kubenswrapper[4580]: E0321 05:09:01.667344 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs" podUID="6f7dea10-53e8-4c25-87bc-ffd154d4cb7d" Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.672751 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-gqlhn" event={"ID":"fad28507-ca7b-4452-b392-f0b68e1f9d64","Type":"ContainerStarted","Data":"1a656c7c542603fc0098b5f8c5493469dd202db77dda9528319ce68aef9bdaa9"} Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.687003 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r" event={"ID":"67d1d125-57c7-4c30-a51a-24db28fb4818","Type":"ContainerStarted","Data":"4dc79b37722af23387622ade60b9ce423ca42e4a03165d946d5f87e01e5d8960"} Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.691890 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-kg5cb" 
event={"ID":"bd6a540f-0a1b-4098-8573-b9049d52f49b","Type":"ContainerStarted","Data":"13c2500be994cf9ab1ae004553879ab0825f49d8b57f08d3da55fa1184812789"} Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.696093 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6" event={"ID":"127bc03d-748e-4919-97f8-6f66ab3e2a8a","Type":"ContainerStarted","Data":"fd874529168043ed32054bcfa97fdc718a3c09dfc3b0f16b3cb71864b50eea8a"} Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.703520 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl" event={"ID":"02dbd40b-11b9-4fca-9617-72b7be489626","Type":"ContainerStarted","Data":"35b852974474192aa9b480ca73651b42ebc41842a77b8a3d3dbc5f1c655970ae"} Mar 21 05:09:01 crc kubenswrapper[4580]: I0321 05:09:01.710515 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm" event={"ID":"a216d106-9a69-4143-8766-4e505f2b5a8f","Type":"ContainerStarted","Data":"9c046043f1829568efcb154fc2c971eea5772b542aa2d48283462bfc6842aaee"} Mar 21 05:09:02 crc kubenswrapper[4580]: I0321 05:09:02.315438 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-nd42d\" (UID: \"b56378b1-33b0-4032-a383-49163ca1811d\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:09:02 crc kubenswrapper[4580]: E0321 05:09:02.315654 4580 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 05:09:02 crc kubenswrapper[4580]: E0321 05:09:02.315735 4580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert podName:b56378b1-33b0-4032-a383-49163ca1811d nodeName:}" failed. No retries permitted until 2026-03-21 05:09:06.315715538 +0000 UTC m=+1051.398299166 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert") pod "infra-operator-controller-manager-5595c7d6ff-nd42d" (UID: "b56378b1-33b0-4032-a383-49163ca1811d") : secret "infra-operator-webhook-server-cert" not found Mar 21 05:09:02 crc kubenswrapper[4580]: I0321 05:09:02.623688 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v\" (UID: \"2e366c15-abc4-4e05-9054-cd7828e00059\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:09:02 crc kubenswrapper[4580]: E0321 05:09:02.624104 4580 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:09:02 crc kubenswrapper[4580]: E0321 05:09:02.624178 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert podName:2e366c15-abc4-4e05-9054-cd7828e00059 nodeName:}" failed. No retries permitted until 2026-03-21 05:09:06.624154555 +0000 UTC m=+1051.706738183 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" (UID: "2e366c15-abc4-4e05-9054-cd7828e00059") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:09:02 crc kubenswrapper[4580]: E0321 05:09:02.724849 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:88f2db101f619563231cfa13f4488596637731f0ebe33c661d4a5e48a86dd3e8\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx" podUID="2a0c721d-68cd-46de-8292-6bd8373e1106" Mar 21 05:09:02 crc kubenswrapper[4580]: E0321 05:09:02.725250 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552\\\"\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" podUID="2535b22b-0bed-4ffd-9430-ca9fb3230c62" Mar 21 05:09:02 crc kubenswrapper[4580]: E0321 05:09:02.725337 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs" podUID="6f7dea10-53e8-4c25-87bc-ffd154d4cb7d" Mar 21 05:09:02 crc kubenswrapper[4580]: E0321 05:09:02.725701 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx" podUID="92286fdb-b69e-4028-8a93-3517469a731c" Mar 21 05:09:02 crc kubenswrapper[4580]: E0321 05:09:02.730282 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a\\\"\"" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984" podUID="0e31c4f0-9b9d-4c10-84de-d15718775f9a" Mar 21 05:09:02 crc kubenswrapper[4580]: E0321 05:09:02.733123 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:0e0d48e3ca53577e20c81a87f0be6b3254c0b8418e3b446b68c8b5849af7213e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5" podUID="8b025526-696f-4d7d-82ed-df03050fa1fd" Mar 21 05:09:02 crc kubenswrapper[4580]: E0321 05:09:02.733991 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v" podUID="5992ccfd-4585-49b2-84ff-3f1fe6812a82" Mar 21 05:09:03 crc kubenswrapper[4580]: E0321 05:09:03.233525 4580 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 05:09:03 crc kubenswrapper[4580]: E0321 05:09:03.233856 4580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs podName:19a22b87-c6f3-4020-aa11-2a940041f49c nodeName:}" failed. No retries permitted until 2026-03-21 05:09:07.233839489 +0000 UTC m=+1052.316423117 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs") pod "openstack-operator-controller-manager-684bbdfff8-7nr7w" (UID: "19a22b87-c6f3-4020-aa11-2a940041f49c") : secret "metrics-server-cert" not found Mar 21 05:09:03 crc kubenswrapper[4580]: I0321 05:09:03.233555 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:03 crc kubenswrapper[4580]: I0321 05:09:03.234472 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:03 crc kubenswrapper[4580]: E0321 05:09:03.234631 4580 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 05:09:03 crc kubenswrapper[4580]: E0321 05:09:03.234731 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs podName:19a22b87-c6f3-4020-aa11-2a940041f49c nodeName:}" failed. 
No retries permitted until 2026-03-21 05:09:07.234718372 +0000 UTC m=+1052.317302000 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs") pod "openstack-operator-controller-manager-684bbdfff8-7nr7w" (UID: "19a22b87-c6f3-4020-aa11-2a940041f49c") : secret "webhook-server-cert" not found Mar 21 05:09:06 crc kubenswrapper[4580]: I0321 05:09:06.384809 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-nd42d\" (UID: \"b56378b1-33b0-4032-a383-49163ca1811d\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:09:06 crc kubenswrapper[4580]: E0321 05:09:06.385037 4580 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 05:09:06 crc kubenswrapper[4580]: E0321 05:09:06.385116 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert podName:b56378b1-33b0-4032-a383-49163ca1811d nodeName:}" failed. No retries permitted until 2026-03-21 05:09:14.385098228 +0000 UTC m=+1059.467681856 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert") pod "infra-operator-controller-manager-5595c7d6ff-nd42d" (UID: "b56378b1-33b0-4032-a383-49163ca1811d") : secret "infra-operator-webhook-server-cert" not found Mar 21 05:09:06 crc kubenswrapper[4580]: I0321 05:09:06.691179 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v\" (UID: \"2e366c15-abc4-4e05-9054-cd7828e00059\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:09:06 crc kubenswrapper[4580]: E0321 05:09:06.691340 4580 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:09:06 crc kubenswrapper[4580]: E0321 05:09:06.691382 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert podName:2e366c15-abc4-4e05-9054-cd7828e00059 nodeName:}" failed. No retries permitted until 2026-03-21 05:09:14.691369357 +0000 UTC m=+1059.773952985 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" (UID: "2e366c15-abc4-4e05-9054-cd7828e00059") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:09:07 crc kubenswrapper[4580]: I0321 05:09:07.298707 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:07 crc kubenswrapper[4580]: I0321 05:09:07.299047 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:07 crc kubenswrapper[4580]: E0321 05:09:07.298906 4580 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 05:09:07 crc kubenswrapper[4580]: E0321 05:09:07.299142 4580 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 05:09:07 crc kubenswrapper[4580]: E0321 05:09:07.299172 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs podName:19a22b87-c6f3-4020-aa11-2a940041f49c nodeName:}" failed. No retries permitted until 2026-03-21 05:09:15.299154331 +0000 UTC m=+1060.381737969 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs") pod "openstack-operator-controller-manager-684bbdfff8-7nr7w" (UID: "19a22b87-c6f3-4020-aa11-2a940041f49c") : secret "webhook-server-cert" not found Mar 21 05:09:07 crc kubenswrapper[4580]: E0321 05:09:07.299192 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs podName:19a22b87-c6f3-4020-aa11-2a940041f49c nodeName:}" failed. No retries permitted until 2026-03-21 05:09:15.299183552 +0000 UTC m=+1060.381767180 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs") pod "openstack-operator-controller-manager-684bbdfff8-7nr7w" (UID: "19a22b87-c6f3-4020-aa11-2a940041f49c") : secret "metrics-server-cert" not found Mar 21 05:09:14 crc kubenswrapper[4580]: I0321 05:09:14.420642 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-nd42d\" (UID: \"b56378b1-33b0-4032-a383-49163ca1811d\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:09:14 crc kubenswrapper[4580]: E0321 05:09:14.420768 4580 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 05:09:14 crc kubenswrapper[4580]: E0321 05:09:14.421122 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert podName:b56378b1-33b0-4032-a383-49163ca1811d nodeName:}" failed. No retries permitted until 2026-03-21 05:09:30.421105059 +0000 UTC m=+1075.503688687 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert") pod "infra-operator-controller-manager-5595c7d6ff-nd42d" (UID: "b56378b1-33b0-4032-a383-49163ca1811d") : secret "infra-operator-webhook-server-cert" not found Mar 21 05:09:14 crc kubenswrapper[4580]: I0321 05:09:14.725386 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v\" (UID: \"2e366c15-abc4-4e05-9054-cd7828e00059\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:09:14 crc kubenswrapper[4580]: E0321 05:09:14.725993 4580 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:09:14 crc kubenswrapper[4580]: E0321 05:09:14.726104 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert podName:2e366c15-abc4-4e05-9054-cd7828e00059 nodeName:}" failed. No retries permitted until 2026-03-21 05:09:30.726082004 +0000 UTC m=+1075.808665642 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" (UID: "2e366c15-abc4-4e05-9054-cd7828e00059") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:09:14 crc kubenswrapper[4580]: E0321 05:09:14.788093 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:d9c086e2bb020a35ce4b7e4943a198808181e8f40785ab588e7904999e82885a" Mar 21 05:09:14 crc kubenswrapper[4580]: E0321 05:09:14.788299 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:d9c086e2bb020a35ce4b7e4943a198808181e8f40785ab588e7904999e82885a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fldrb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-fbf7bbb96-mn29l_openstack-operators(d45ead43-2f4d-46fc-857f-7e6dbb3e08f6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:09:14 crc kubenswrapper[4580]: E0321 05:09:14.789648 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l" podUID="d45ead43-2f4d-46fc-857f-7e6dbb3e08f6" Mar 21 05:09:14 crc kubenswrapper[4580]: E0321 05:09:14.826688 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d9c086e2bb020a35ce4b7e4943a198808181e8f40785ab588e7904999e82885a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l" podUID="d45ead43-2f4d-46fc-857f-7e6dbb3e08f6" Mar 21 05:09:15 crc kubenswrapper[4580]: I0321 05:09:15.336645 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:15 crc kubenswrapper[4580]: I0321 05:09:15.336712 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:15 crc kubenswrapper[4580]: E0321 05:09:15.336874 4580 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 05:09:15 crc kubenswrapper[4580]: E0321 05:09:15.336922 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs podName:19a22b87-c6f3-4020-aa11-2a940041f49c nodeName:}" failed. No retries permitted until 2026-03-21 05:09:31.336907719 +0000 UTC m=+1076.419491347 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs") pod "openstack-operator-controller-manager-684bbdfff8-7nr7w" (UID: "19a22b87-c6f3-4020-aa11-2a940041f49c") : secret "metrics-server-cert" not found Mar 21 05:09:15 crc kubenswrapper[4580]: E0321 05:09:15.337283 4580 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 05:09:15 crc kubenswrapper[4580]: E0321 05:09:15.337314 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs podName:19a22b87-c6f3-4020-aa11-2a940041f49c nodeName:}" failed. No retries permitted until 2026-03-21 05:09:31.337307019 +0000 UTC m=+1076.419890647 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs") pod "openstack-operator-controller-manager-684bbdfff8-7nr7w" (UID: "19a22b87-c6f3-4020-aa11-2a940041f49c") : secret "webhook-server-cert" not found Mar 21 05:09:15 crc kubenswrapper[4580]: E0321 05:09:15.394476 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:f8c37859d1268fabc21e3d910b2484bda7254f22b50b9df0696f8f5d72a0abf4" Mar 21 05:09:15 crc kubenswrapper[4580]: E0321 05:09:15.394676 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:f8c37859d1268fabc21e3d910b2484bda7254f22b50b9df0696f8f5d72a0abf4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s2vf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-64dc66d669-lk8d6_openstack-operators(127bc03d-748e-4919-97f8-6f66ab3e2a8a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:09:15 crc kubenswrapper[4580]: E0321 05:09:15.397328 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6" podUID="127bc03d-748e-4919-97f8-6f66ab3e2a8a" Mar 21 05:09:15 crc kubenswrapper[4580]: E0321 05:09:15.835962 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:f8c37859d1268fabc21e3d910b2484bda7254f22b50b9df0696f8f5d72a0abf4\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6" podUID="127bc03d-748e-4919-97f8-6f66ab3e2a8a" Mar 21 05:09:15 crc kubenswrapper[4580]: E0321 05:09:15.988160 4580 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:8cff216ce54922d6d182d9f5dcd0d6bc51d6560e808319c7e20487ee7b6474d1" Mar 21 05:09:15 crc kubenswrapper[4580]: E0321 05:09:15.988351 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:8cff216ce54922d6d182d9f5dcd0d6bc51d6560e808319c7e20487ee7b6474d1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vjq94,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-66dd9d474d-kw6px_openstack-operators(4faee52b-73ab-41d7-a319-33eb67e1aa30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:09:15 crc kubenswrapper[4580]: E0321 05:09:15.989513 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px" podUID="4faee52b-73ab-41d7-a319-33eb67e1aa30" Mar 21 05:09:16 crc kubenswrapper[4580]: E0321 05:09:16.775147 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:b0ba0389a96140174eaad4ad8cc3e98118472d640bdca18046877e973f009ff4" Mar 21 05:09:16 crc kubenswrapper[4580]: E0321 05:09:16.775372 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:b0ba0389a96140174eaad4ad8cc3e98118472d640bdca18046877e973f009ff4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xqj75,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-56f74467c6-2h95r_openstack-operators(67d1d125-57c7-4c30-a51a-24db28fb4818): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:09:16 crc kubenswrapper[4580]: E0321 05:09:16.776584 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r" podUID="67d1d125-57c7-4c30-a51a-24db28fb4818" Mar 21 05:09:16 crc kubenswrapper[4580]: E0321 05:09:16.841849 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:b0ba0389a96140174eaad4ad8cc3e98118472d640bdca18046877e973f009ff4\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r" podUID="67d1d125-57c7-4c30-a51a-24db28fb4818" Mar 21 05:09:16 crc kubenswrapper[4580]: E0321 05:09:16.842071 4580 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:8cff216ce54922d6d182d9f5dcd0d6bc51d6560e808319c7e20487ee7b6474d1\\\"\"" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px" podUID="4faee52b-73ab-41d7-a319-33eb67e1aa30" Mar 21 05:09:17 crc kubenswrapper[4580]: E0321 05:09:17.367305 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:4f487da837018bfcd11dd794ba8f4dacc839b92e0d060c146fd1f771d750abf8" Mar 21 05:09:17 crc kubenswrapper[4580]: E0321 05:09:17.367958 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:4f487da837018bfcd11dd794ba8f4dacc839b92e0d060c146fd1f771d750abf8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qgfsz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6f5b7bcd4-7dkgm_openstack-operators(a216d106-9a69-4143-8766-4e505f2b5a8f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:09:17 crc kubenswrapper[4580]: E0321 05:09:17.369228 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm" podUID="a216d106-9a69-4143-8766-4e505f2b5a8f" Mar 21 05:09:17 crc kubenswrapper[4580]: E0321 05:09:17.848401 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:4f487da837018bfcd11dd794ba8f4dacc839b92e0d060c146fd1f771d750abf8\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm" podUID="a216d106-9a69-4143-8766-4e505f2b5a8f" Mar 21 05:09:22 crc kubenswrapper[4580]: E0321 05:09:22.471539 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.213:5001/openstack-k8s-operators/keystone-operator:aa56ed93426574ec9e667fbabd364d97425571a9" Mar 21 05:09:22 crc kubenswrapper[4580]: E0321 05:09:22.471911 4580 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.213:5001/openstack-k8s-operators/keystone-operator:aa56ed93426574ec9e667fbabd364d97425571a9" Mar 21 05:09:22 crc kubenswrapper[4580]: E0321 05:09:22.472102 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.213:5001/openstack-k8s-operators/keystone-operator:aa56ed93426574ec9e667fbabd364d97425571a9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tcgfw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-784c64596-vdvhl_openstack-operators(02dbd40b-11b9-4fca-9617-72b7be489626): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:09:22 crc kubenswrapper[4580]: E0321 05:09:22.473366 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl" podUID="02dbd40b-11b9-4fca-9617-72b7be489626" Mar 21 05:09:22 crc kubenswrapper[4580]: E0321 05:09:22.893571 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.213:5001/openstack-k8s-operators/keystone-operator:aa56ed93426574ec9e667fbabd364d97425571a9\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl" podUID="02dbd40b-11b9-4fca-9617-72b7be489626" Mar 21 05:09:30 crc kubenswrapper[4580]: I0321 05:09:30.469282 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-nd42d\" (UID: \"b56378b1-33b0-4032-a383-49163ca1811d\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:09:30 crc kubenswrapper[4580]: I0321 05:09:30.475319 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b56378b1-33b0-4032-a383-49163ca1811d-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-nd42d\" (UID: \"b56378b1-33b0-4032-a383-49163ca1811d\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:09:30 crc kubenswrapper[4580]: I0321 05:09:30.709832 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:09:30 crc kubenswrapper[4580]: I0321 05:09:30.775353 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v\" (UID: \"2e366c15-abc4-4e05-9054-cd7828e00059\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:09:30 crc kubenswrapper[4580]: I0321 05:09:30.781424 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e366c15-abc4-4e05-9054-cd7828e00059-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v\" (UID: \"2e366c15-abc4-4e05-9054-cd7828e00059\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:09:30 crc kubenswrapper[4580]: I0321 05:09:30.949771 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:09:31 crc kubenswrapper[4580]: E0321 05:09:31.112255 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552" Mar 21 05:09:31 crc kubenswrapper[4580]: E0321 05:09:31.112482 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lgctk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-bc5c78db9-d5jll_openstack-operators(2535b22b-0bed-4ffd-9430-ca9fb3230c62): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:09:31 crc kubenswrapper[4580]: E0321 05:09:31.114951 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" podUID="2535b22b-0bed-4ffd-9430-ca9fb3230c62" Mar 21 05:09:31 crc kubenswrapper[4580]: I0321 05:09:31.384822 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:31 crc kubenswrapper[4580]: I0321 05:09:31.384891 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:31 crc kubenswrapper[4580]: I0321 05:09:31.389044 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-metrics-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:31 crc kubenswrapper[4580]: I0321 05:09:31.390118 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19a22b87-c6f3-4020-aa11-2a940041f49c-webhook-certs\") pod \"openstack-operator-controller-manager-684bbdfff8-7nr7w\" (UID: \"19a22b87-c6f3-4020-aa11-2a940041f49c\") " pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:31 crc kubenswrapper[4580]: I0321 05:09:31.464727 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:31 crc kubenswrapper[4580]: E0321 05:09:31.685699 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 21 05:09:31 crc kubenswrapper[4580]: E0321 05:09:31.686395 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fqmkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-5mdkx_openstack-operators(92286fdb-b69e-4028-8a93-3517469a731c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:09:31 crc kubenswrapper[4580]: E0321 05:09:31.687594 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx" podUID="92286fdb-b69e-4028-8a93-3517469a731c" Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.541243 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v"] Mar 21 05:09:32 crc kubenswrapper[4580]: W0321 05:09:32.569979 4580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e366c15_abc4_4e05_9054_cd7828e00059.slice/crio-1aad22519952b019eea7ee889716fd4e0e61c8dcb372c2ce98036a2cada74746 WatchSource:0}: Error finding container 1aad22519952b019eea7ee889716fd4e0e61c8dcb372c2ce98036a2cada74746: Status 404 returned error can't find the container with id 1aad22519952b019eea7ee889716fd4e0e61c8dcb372c2ce98036a2cada74746 Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.610287 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d"] Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.725172 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w"] Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.965385 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-kg5cb" event={"ID":"bd6a540f-0a1b-4098-8573-b9049d52f49b","Type":"ContainerStarted","Data":"707ceda8427bf0bf3f5f4f43a4a545cf65016f6023499ea04abe7a6ce162b24a"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.965466 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-kg5cb" Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.966384 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" event={"ID":"b56378b1-33b0-4032-a383-49163ca1811d","Type":"ContainerStarted","Data":"cedcad301a90c19847972383face8bf974b241cebb1a2d0af71271e66fca4dfe"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.967733 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6" 
event={"ID":"127bc03d-748e-4919-97f8-6f66ab3e2a8a","Type":"ContainerStarted","Data":"aa716bd74fb6e4ceed4923f2eac5faf942cbcebb6b3f6dfcd3258084970c929a"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.967914 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6" Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.969336 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-lkjq4" event={"ID":"a96026f1-4dcb-483a-83da-aecc72e7590c","Type":"ContainerStarted","Data":"24bb8b7ad4fb1ec4f1713d27ded886871096bae49a137d5f898838596a496991"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.969461 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-lkjq4" Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.971086 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r" event={"ID":"67d1d125-57c7-4c30-a51a-24db28fb4818","Type":"ContainerStarted","Data":"02e07854c4d379fcb5122f0ba8ca43aa4f2a5bf4c679555a9e942703bf326411"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.971239 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r" Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.972749 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-6h2nr" event={"ID":"d3c84591-dfcf-48e6-a022-25562660675e","Type":"ContainerStarted","Data":"3d63fc6ee66ded6c93d7884568bda72489f99aba88ef3716f7cff74bda46e994"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.972882 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-6h2nr" Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.974524 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-gqlhn" event={"ID":"fad28507-ca7b-4452-b392-f0b68e1f9d64","Type":"ContainerStarted","Data":"de32ffccaf041aad55db58bab22d7829227bc1911fd8dc2c07a86fb5fc221c76"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.974613 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-gqlhn" Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.975864 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-vxzhk" event={"ID":"5522a0a6-b385-4bf6-990c-5a07561257b0","Type":"ContainerStarted","Data":"2491873f527e223b7d30d201a6acf256d38261e77a2442a4a4df39d6c23e5311"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.976000 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-vxzhk" Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.977601 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" event={"ID":"2e366c15-abc4-4e05-9054-cd7828e00059","Type":"ContainerStarted","Data":"1aad22519952b019eea7ee889716fd4e0e61c8dcb372c2ce98036a2cada74746"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.981368 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx" event={"ID":"2a0c721d-68cd-46de-8292-6bd8373e1106","Type":"ContainerStarted","Data":"c655da14db25f7744e1ee1c09afc5eea95b5bae2d726cd1fe186369bedee76e1"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.981543 4580 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx" Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.982809 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" event={"ID":"19a22b87-c6f3-4020-aa11-2a940041f49c","Type":"ContainerStarted","Data":"1abdeaf5a67202641f9cab33ee981e77affe6fecb3f4fc450979e27a339b3856"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.984086 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-q8jxn" event={"ID":"6080f6a7-a68f-447a-bedd-182cd69337b5","Type":"ContainerStarted","Data":"580f0a3ebcb73ea01bfb8fd11c2a0b69496813bcb297ceb141e24c8f5987f166"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.984230 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-q8jxn" Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.987950 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v" event={"ID":"5992ccfd-4585-49b2-84ff-3f1fe6812a82","Type":"ContainerStarted","Data":"2c62489f1c106d92cea964a9ad4ca3b0e23ac6531477b9fc25068bf68aedb498"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.988137 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v" Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.994878 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5" event={"ID":"8b025526-696f-4d7d-82ed-df03050fa1fd","Type":"ContainerStarted","Data":"9968e14dcd87ee4ebd14b4641983af243c3bd02f72a3ef603a5ad011a448536b"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 
05:09:32.995093 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5" Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.997134 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-kmqjb" event={"ID":"21248f61-caf9-4660-8299-3b10368fa8ad","Type":"ContainerStarted","Data":"3846bb11d7d938d96ad2b75a867e23b84d8b40b7b2ca2c6238fb8567731fde8b"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.997810 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-kmqjb" Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.999176 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs" event={"ID":"6f7dea10-53e8-4c25-87bc-ffd154d4cb7d","Type":"ContainerStarted","Data":"5e0c252721bd3e87f0789792b7afff71a911c0ae181b1d2fd1fbbcdf05028eb9"} Mar 21 05:09:32 crc kubenswrapper[4580]: I0321 05:09:32.999968 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.010243 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l" event={"ID":"d45ead43-2f4d-46fc-857f-7e6dbb3e08f6","Type":"ContainerStarted","Data":"8a0c4ced3ebc5cef7c247c522d883a9c8f67d62353083c155650667d43dceda0"} Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.011044 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.012267 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984" event={"ID":"0e31c4f0-9b9d-4c10-84de-d15718775f9a","Type":"ContainerStarted","Data":"9891b50500c61b64c5454998b1768a0d594db75196c5e47f803e2747c61af20c"} Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.012653 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.054132 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-kg5cb" podStartSLOduration=13.372189728 podStartE2EDuration="35.054115381s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.577481589 +0000 UTC m=+1045.660065217" lastFinishedPulling="2026-03-21 05:09:22.259407242 +0000 UTC m=+1067.341990870" observedRunningTime="2026-03-21 05:09:33.035111108 +0000 UTC m=+1078.117694756" watchObservedRunningTime="2026-03-21 05:09:33.054115381 +0000 UTC m=+1078.136699009" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.105325 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v" podStartSLOduration=4.281200556 podStartE2EDuration="35.105300114s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.844258944 +0000 UTC m=+1045.926842572" lastFinishedPulling="2026-03-21 05:09:31.668358502 +0000 UTC m=+1076.750942130" observedRunningTime="2026-03-21 05:09:33.098427093 +0000 UTC m=+1078.181010731" watchObservedRunningTime="2026-03-21 05:09:33.105300114 +0000 UTC m=+1078.187883742" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.107805 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-lkjq4" 
podStartSLOduration=15.037712844 podStartE2EDuration="35.10779602s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.474874075 +0000 UTC m=+1045.557457693" lastFinishedPulling="2026-03-21 05:09:20.544957241 +0000 UTC m=+1065.627540869" observedRunningTime="2026-03-21 05:09:33.053163645 +0000 UTC m=+1078.135747294" watchObservedRunningTime="2026-03-21 05:09:33.10779602 +0000 UTC m=+1078.190379648" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.155760 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984" podStartSLOduration=4.192216692 podStartE2EDuration="35.155737938s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.831549108 +0000 UTC m=+1045.914132746" lastFinishedPulling="2026-03-21 05:09:31.795070364 +0000 UTC m=+1076.877653992" observedRunningTime="2026-03-21 05:09:33.153255883 +0000 UTC m=+1078.235839531" watchObservedRunningTime="2026-03-21 05:09:33.155737938 +0000 UTC m=+1078.238321566" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.207411 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6" podStartSLOduration=3.865227865 podStartE2EDuration="35.207376914s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.577480559 +0000 UTC m=+1045.660064197" lastFinishedPulling="2026-03-21 05:09:31.919629618 +0000 UTC m=+1077.002213246" observedRunningTime="2026-03-21 05:09:33.204156869 +0000 UTC m=+1078.286740497" watchObservedRunningTime="2026-03-21 05:09:33.207376914 +0000 UTC m=+1078.289960542" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.292770 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx" 
podStartSLOduration=5.689998463 podStartE2EDuration="35.292747442s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.829889534 +0000 UTC m=+1045.912473162" lastFinishedPulling="2026-03-21 05:09:30.432638513 +0000 UTC m=+1075.515222141" observedRunningTime="2026-03-21 05:09:33.267203176 +0000 UTC m=+1078.349786814" watchObservedRunningTime="2026-03-21 05:09:33.292747442 +0000 UTC m=+1078.375331060" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.387499 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5" podStartSLOduration=4.345930908 podStartE2EDuration="35.387484537s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.849276527 +0000 UTC m=+1045.931860145" lastFinishedPulling="2026-03-21 05:09:31.890830146 +0000 UTC m=+1076.973413774" observedRunningTime="2026-03-21 05:09:33.385920946 +0000 UTC m=+1078.468504584" watchObservedRunningTime="2026-03-21 05:09:33.387484537 +0000 UTC m=+1078.470068165" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.389934 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-kmqjb" podStartSLOduration=14.681833813 podStartE2EDuration="35.389927552s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.432142795 +0000 UTC m=+1045.514726423" lastFinishedPulling="2026-03-21 05:09:21.140236524 +0000 UTC m=+1066.222820162" observedRunningTime="2026-03-21 05:09:33.343075853 +0000 UTC m=+1078.425659491" watchObservedRunningTime="2026-03-21 05:09:33.389927552 +0000 UTC m=+1078.472511180" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.431175 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-6h2nr" 
podStartSLOduration=14.29831048 podStartE2EDuration="35.431158102s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.007427493 +0000 UTC m=+1045.090011121" lastFinishedPulling="2026-03-21 05:09:21.140275115 +0000 UTC m=+1066.222858743" observedRunningTime="2026-03-21 05:09:33.429364325 +0000 UTC m=+1078.511947963" watchObservedRunningTime="2026-03-21 05:09:33.431158102 +0000 UTC m=+1078.513741730" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.491670 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-gqlhn" podStartSLOduration=13.752009712 podStartE2EDuration="35.491653462s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.519761772 +0000 UTC m=+1045.602345400" lastFinishedPulling="2026-03-21 05:09:22.259405522 +0000 UTC m=+1067.341989150" observedRunningTime="2026-03-21 05:09:33.485633673 +0000 UTC m=+1078.568217301" watchObservedRunningTime="2026-03-21 05:09:33.491653462 +0000 UTC m=+1078.574237080" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.538153 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l" podStartSLOduration=3.934313292 podStartE2EDuration="35.538132421s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.374874281 +0000 UTC m=+1045.457457909" lastFinishedPulling="2026-03-21 05:09:31.97869341 +0000 UTC m=+1077.061277038" observedRunningTime="2026-03-21 05:09:33.530243162 +0000 UTC m=+1078.612826800" watchObservedRunningTime="2026-03-21 05:09:33.538132421 +0000 UTC m=+1078.620716049" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.579258 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs" 
podStartSLOduration=4.521206223 podStartE2EDuration="35.579218268s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.832230246 +0000 UTC m=+1045.914813874" lastFinishedPulling="2026-03-21 05:09:31.890242281 +0000 UTC m=+1076.972825919" observedRunningTime="2026-03-21 05:09:33.576470485 +0000 UTC m=+1078.659054123" watchObservedRunningTime="2026-03-21 05:09:33.579218268 +0000 UTC m=+1078.661801896" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.637599 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-q8jxn" podStartSLOduration=15.898848278 podStartE2EDuration="35.637579761s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.805003476 +0000 UTC m=+1045.887587104" lastFinishedPulling="2026-03-21 05:09:20.543734949 +0000 UTC m=+1065.626318587" observedRunningTime="2026-03-21 05:09:33.636122943 +0000 UTC m=+1078.718706591" watchObservedRunningTime="2026-03-21 05:09:33.637579761 +0000 UTC m=+1078.720163389" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.699191 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r" podStartSLOduration=4.591926684 podStartE2EDuration="35.69917329s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.805863589 +0000 UTC m=+1045.888447217" lastFinishedPulling="2026-03-21 05:09:31.913110195 +0000 UTC m=+1076.995693823" observedRunningTime="2026-03-21 05:09:33.694498296 +0000 UTC m=+1078.777081944" watchObservedRunningTime="2026-03-21 05:09:33.69917329 +0000 UTC m=+1078.781756918" Mar 21 05:09:33 crc kubenswrapper[4580]: I0321 05:09:33.725183 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-vxzhk" 
podStartSLOduration=15.714578205 podStartE2EDuration="35.725166857s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.533918177 +0000 UTC m=+1045.616501805" lastFinishedPulling="2026-03-21 05:09:20.544506839 +0000 UTC m=+1065.627090457" observedRunningTime="2026-03-21 05:09:33.724640964 +0000 UTC m=+1078.807224602" watchObservedRunningTime="2026-03-21 05:09:33.725166857 +0000 UTC m=+1078.807750485" Mar 21 05:09:34 crc kubenswrapper[4580]: I0321 05:09:34.027399 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm" event={"ID":"a216d106-9a69-4143-8766-4e505f2b5a8f","Type":"ContainerStarted","Data":"d7f89f6cfbd8e9103442259df2624f6cd693c6cdb5347b38edad8d7b26fe80e2"} Mar 21 05:09:34 crc kubenswrapper[4580]: I0321 05:09:34.027949 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm" Mar 21 05:09:34 crc kubenswrapper[4580]: I0321 05:09:34.029654 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" event={"ID":"19a22b87-c6f3-4020-aa11-2a940041f49c","Type":"ContainerStarted","Data":"05074ddbafb89ed3642475619803b4278d0991938844bd567fdfc391bfe1f6ad"} Mar 21 05:09:34 crc kubenswrapper[4580]: I0321 05:09:34.029687 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:34 crc kubenswrapper[4580]: I0321 05:09:34.031252 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px" event={"ID":"4faee52b-73ab-41d7-a319-33eb67e1aa30","Type":"ContainerStarted","Data":"9d2bfffb316ff96041cb779956de0da983a3f18c66f35b176f5aa8029f4805bc"} Mar 21 05:09:34 crc kubenswrapper[4580]: I0321 05:09:34.046041 4580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm" podStartSLOduration=3.541808371 podStartE2EDuration="36.046025513s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.59304101 +0000 UTC m=+1045.675624638" lastFinishedPulling="2026-03-21 05:09:33.097258152 +0000 UTC m=+1078.179841780" observedRunningTime="2026-03-21 05:09:34.044439221 +0000 UTC m=+1079.127022849" watchObservedRunningTime="2026-03-21 05:09:34.046025513 +0000 UTC m=+1079.128609141" Mar 21 05:09:34 crc kubenswrapper[4580]: I0321 05:09:34.068888 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px" podStartSLOduration=4.370758694 podStartE2EDuration="36.068872367s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.333730472 +0000 UTC m=+1045.416314100" lastFinishedPulling="2026-03-21 05:09:32.031844145 +0000 UTC m=+1077.114427773" observedRunningTime="2026-03-21 05:09:34.063424763 +0000 UTC m=+1079.146008391" watchObservedRunningTime="2026-03-21 05:09:34.068872367 +0000 UTC m=+1079.151455995" Mar 21 05:09:34 crc kubenswrapper[4580]: I0321 05:09:34.109299 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" podStartSLOduration=35.109284386 podStartE2EDuration="35.109284386s" podCreationTimestamp="2026-03-21 05:08:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:09:34.10338661 +0000 UTC m=+1079.185970248" watchObservedRunningTime="2026-03-21 05:09:34.109284386 +0000 UTC m=+1079.191868014" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.065443 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" event={"ID":"b56378b1-33b0-4032-a383-49163ca1811d","Type":"ContainerStarted","Data":"803a752f653b2f6f55a192cc9af5da542c85f88bf890ca6538ec058c306b0c1e"} Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.066077 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.067595 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl" event={"ID":"02dbd40b-11b9-4fca-9617-72b7be489626","Type":"ContainerStarted","Data":"e926b9bef633103948c6055e05a21d5adf3590b331d25c38b13d6a3d6af23a93"} Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.067764 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.069019 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" event={"ID":"2e366c15-abc4-4e05-9054-cd7828e00059","Type":"ContainerStarted","Data":"1ce15e246c672cd389d46027dc1a07a2a84fb3e40590306e394271953088edd4"} Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.069446 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.118293 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" podStartSLOduration=35.404047434 podStartE2EDuration="40.118272458s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:32.649322845 +0000 UTC m=+1077.731906473" 
lastFinishedPulling="2026-03-21 05:09:37.363547869 +0000 UTC m=+1082.446131497" observedRunningTime="2026-03-21 05:09:38.091194802 +0000 UTC m=+1083.173778440" watchObservedRunningTime="2026-03-21 05:09:38.118272458 +0000 UTC m=+1083.200856086" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.139661 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" podStartSLOduration=35.34342051 podStartE2EDuration="40.139637773s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:32.573442708 +0000 UTC m=+1077.656026336" lastFinishedPulling="2026-03-21 05:09:37.369659971 +0000 UTC m=+1082.452243599" observedRunningTime="2026-03-21 05:09:38.120920858 +0000 UTC m=+1083.203504506" watchObservedRunningTime="2026-03-21 05:09:38.139637773 +0000 UTC m=+1083.222221401" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.144050 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl" podStartSLOduration=3.050405194 podStartE2EDuration="40.144033619s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.577624492 +0000 UTC m=+1045.660208120" lastFinishedPulling="2026-03-21 05:09:37.671252917 +0000 UTC m=+1082.753836545" observedRunningTime="2026-03-21 05:09:38.136686335 +0000 UTC m=+1083.219269963" watchObservedRunningTime="2026-03-21 05:09:38.144033619 +0000 UTC m=+1083.226617247" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.475806 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-6h2nr" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.586725 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-gqlhn" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.637473 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.641197 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-kw6px" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.798500 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-vxzhk" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.831594 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-lkjq4" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.834227 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-lk8d6" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.965000 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-kmqjb" Mar 21 05:09:38 crc kubenswrapper[4580]: I0321 05:09:38.995393 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-mn29l" Mar 21 05:09:39 crc kubenswrapper[4580]: I0321 05:09:39.003290 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-7dkgm" Mar 21 05:09:39 crc kubenswrapper[4580]: I0321 05:09:39.030610 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-sdpcs" Mar 21 05:09:39 crc kubenswrapper[4580]: I0321 05:09:39.116054 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-2h95r" Mar 21 05:09:39 crc kubenswrapper[4580]: I0321 05:09:39.201736 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-kg5cb" Mar 21 05:09:39 crc kubenswrapper[4580]: I0321 05:09:39.253448 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-ln984" Mar 21 05:09:39 crc kubenswrapper[4580]: I0321 05:09:39.605126 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-78zl5" Mar 21 05:09:39 crc kubenswrapper[4580]: I0321 05:09:39.630531 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-g52cx" Mar 21 05:09:39 crc kubenswrapper[4580]: I0321 05:09:39.642504 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-rqw4v" Mar 21 05:09:39 crc kubenswrapper[4580]: I0321 05:09:39.943827 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-q8jxn" Mar 21 05:09:41 crc kubenswrapper[4580]: I0321 05:09:41.471942 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-684bbdfff8-7nr7w" Mar 21 05:09:42 crc kubenswrapper[4580]: E0321 05:09:42.620262 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552\\\"\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" podUID="2535b22b-0bed-4ffd-9430-ca9fb3230c62" Mar 21 05:09:44 crc kubenswrapper[4580]: E0321 05:09:44.619834 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx" podUID="92286fdb-b69e-4028-8a93-3517469a731c" Mar 21 05:09:48 crc kubenswrapper[4580]: I0321 05:09:48.977298 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-784c64596-vdvhl" Mar 21 05:09:50 crc kubenswrapper[4580]: I0321 05:09:50.716757 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-nd42d" Mar 21 05:09:50 crc kubenswrapper[4580]: I0321 05:09:50.957737 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v" Mar 21 05:09:58 crc kubenswrapper[4580]: I0321 05:09:58.213563 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" event={"ID":"2535b22b-0bed-4ffd-9430-ca9fb3230c62","Type":"ContainerStarted","Data":"1b1f1da4245afd24d1ec67d70f02976329ccb415b5c6288a900e92133573b319"} Mar 21 05:09:58 crc kubenswrapper[4580]: I0321 05:09:58.214352 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" Mar 21 05:09:58 crc 
kubenswrapper[4580]: I0321 05:09:58.235180 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" podStartSLOduration=3.9960928300000003 podStartE2EDuration="1m0.235148612s" podCreationTimestamp="2026-03-21 05:08:58 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.823330551 +0000 UTC m=+1045.905914179" lastFinishedPulling="2026-03-21 05:09:57.062386343 +0000 UTC m=+1102.144969961" observedRunningTime="2026-03-21 05:09:58.226245125 +0000 UTC m=+1103.308828803" watchObservedRunningTime="2026-03-21 05:09:58.235148612 +0000 UTC m=+1103.317732250" Mar 21 05:09:59 crc kubenswrapper[4580]: I0321 05:09:59.220818 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx" event={"ID":"92286fdb-b69e-4028-8a93-3517469a731c","Type":"ContainerStarted","Data":"480bef93c3408c73cd60e86c7dd2a49805e56d3b8a10d34eb581c1f4002b9c60"} Mar 21 05:09:59 crc kubenswrapper[4580]: I0321 05:09:59.240870 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mdkx" podStartSLOduration=2.969204767 podStartE2EDuration="1m0.240849583s" podCreationTimestamp="2026-03-21 05:08:59 +0000 UTC" firstStartedPulling="2026-03-21 05:09:00.806468185 +0000 UTC m=+1045.889051803" lastFinishedPulling="2026-03-21 05:09:58.078112981 +0000 UTC m=+1103.160696619" observedRunningTime="2026-03-21 05:09:59.237091003 +0000 UTC m=+1104.319674631" watchObservedRunningTime="2026-03-21 05:09:59.240849583 +0000 UTC m=+1104.323433211" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.140571 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567830-lls59"] Mar 21 05:10:00 crc kubenswrapper[4580]: E0321 05:10:00.141046 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d28b1b-fab2-4a70-88e6-a9d956721966" 
containerName="extract-utilities" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.141069 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d28b1b-fab2-4a70-88e6-a9d956721966" containerName="extract-utilities" Mar 21 05:10:00 crc kubenswrapper[4580]: E0321 05:10:00.141089 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d28b1b-fab2-4a70-88e6-a9d956721966" containerName="registry-server" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.141097 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d28b1b-fab2-4a70-88e6-a9d956721966" containerName="registry-server" Mar 21 05:10:00 crc kubenswrapper[4580]: E0321 05:10:00.141120 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d28b1b-fab2-4a70-88e6-a9d956721966" containerName="extract-content" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.141127 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d28b1b-fab2-4a70-88e6-a9d956721966" containerName="extract-content" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.141371 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="97d28b1b-fab2-4a70-88e6-a9d956721966" containerName="registry-server" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.142069 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-lls59" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.144602 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.146499 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.149030 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-lls59"] Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.154118 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.259123 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb5gl\" (UniqueName: \"kubernetes.io/projected/0d267a81-ba86-4ddd-b83b-37ae171c6230-kube-api-access-jb5gl\") pod \"auto-csr-approver-29567830-lls59\" (UID: \"0d267a81-ba86-4ddd-b83b-37ae171c6230\") " pod="openshift-infra/auto-csr-approver-29567830-lls59" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.359980 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb5gl\" (UniqueName: \"kubernetes.io/projected/0d267a81-ba86-4ddd-b83b-37ae171c6230-kube-api-access-jb5gl\") pod \"auto-csr-approver-29567830-lls59\" (UID: \"0d267a81-ba86-4ddd-b83b-37ae171c6230\") " pod="openshift-infra/auto-csr-approver-29567830-lls59" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.376756 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb5gl\" (UniqueName: \"kubernetes.io/projected/0d267a81-ba86-4ddd-b83b-37ae171c6230-kube-api-access-jb5gl\") pod \"auto-csr-approver-29567830-lls59\" (UID: \"0d267a81-ba86-4ddd-b83b-37ae171c6230\") " 
pod="openshift-infra/auto-csr-approver-29567830-lls59" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.456714 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-lls59" Mar 21 05:10:00 crc kubenswrapper[4580]: I0321 05:10:00.930425 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-lls59"] Mar 21 05:10:00 crc kubenswrapper[4580]: W0321 05:10:00.932077 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d267a81_ba86_4ddd_b83b_37ae171c6230.slice/crio-961091fb4a344c0be69ce27e79a4db28456299b6176da5ba3a0b594e24b4a462 WatchSource:0}: Error finding container 961091fb4a344c0be69ce27e79a4db28456299b6176da5ba3a0b594e24b4a462: Status 404 returned error can't find the container with id 961091fb4a344c0be69ce27e79a4db28456299b6176da5ba3a0b594e24b4a462 Mar 21 05:10:01 crc kubenswrapper[4580]: I0321 05:10:01.235445 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567830-lls59" event={"ID":"0d267a81-ba86-4ddd-b83b-37ae171c6230","Type":"ContainerStarted","Data":"961091fb4a344c0be69ce27e79a4db28456299b6176da5ba3a0b594e24b4a462"} Mar 21 05:10:02 crc kubenswrapper[4580]: I0321 05:10:02.244472 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567830-lls59" event={"ID":"0d267a81-ba86-4ddd-b83b-37ae171c6230","Type":"ContainerStarted","Data":"d8eb3f19c2c28fc41332dfdb63caf37a56dfbdabc374580e3c0b9f4828d22d4b"} Mar 21 05:10:02 crc kubenswrapper[4580]: I0321 05:10:02.269138 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567830-lls59" podStartSLOduration=1.293046698 podStartE2EDuration="2.269110849s" podCreationTimestamp="2026-03-21 05:10:00 +0000 UTC" firstStartedPulling="2026-03-21 05:10:00.933945826 +0000 UTC 
m=+1106.016529454" lastFinishedPulling="2026-03-21 05:10:01.910009967 +0000 UTC m=+1106.992593605" observedRunningTime="2026-03-21 05:10:02.264965679 +0000 UTC m=+1107.347549317" watchObservedRunningTime="2026-03-21 05:10:02.269110849 +0000 UTC m=+1107.351694467" Mar 21 05:10:03 crc kubenswrapper[4580]: I0321 05:10:03.252067 4580 generic.go:334] "Generic (PLEG): container finished" podID="0d267a81-ba86-4ddd-b83b-37ae171c6230" containerID="d8eb3f19c2c28fc41332dfdb63caf37a56dfbdabc374580e3c0b9f4828d22d4b" exitCode=0 Mar 21 05:10:03 crc kubenswrapper[4580]: I0321 05:10:03.252106 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567830-lls59" event={"ID":"0d267a81-ba86-4ddd-b83b-37ae171c6230","Type":"ContainerDied","Data":"d8eb3f19c2c28fc41332dfdb63caf37a56dfbdabc374580e3c0b9f4828d22d4b"} Mar 21 05:10:04 crc kubenswrapper[4580]: I0321 05:10:04.577601 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-lls59" Mar 21 05:10:04 crc kubenswrapper[4580]: I0321 05:10:04.723173 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb5gl\" (UniqueName: \"kubernetes.io/projected/0d267a81-ba86-4ddd-b83b-37ae171c6230-kube-api-access-jb5gl\") pod \"0d267a81-ba86-4ddd-b83b-37ae171c6230\" (UID: \"0d267a81-ba86-4ddd-b83b-37ae171c6230\") " Mar 21 05:10:04 crc kubenswrapper[4580]: I0321 05:10:04.731349 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d267a81-ba86-4ddd-b83b-37ae171c6230-kube-api-access-jb5gl" (OuterVolumeSpecName: "kube-api-access-jb5gl") pod "0d267a81-ba86-4ddd-b83b-37ae171c6230" (UID: "0d267a81-ba86-4ddd-b83b-37ae171c6230"). InnerVolumeSpecName "kube-api-access-jb5gl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:04 crc kubenswrapper[4580]: I0321 05:10:04.825427 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb5gl\" (UniqueName: \"kubernetes.io/projected/0d267a81-ba86-4ddd-b83b-37ae171c6230-kube-api-access-jb5gl\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:05 crc kubenswrapper[4580]: I0321 05:10:05.266556 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567830-lls59" event={"ID":"0d267a81-ba86-4ddd-b83b-37ae171c6230","Type":"ContainerDied","Data":"961091fb4a344c0be69ce27e79a4db28456299b6176da5ba3a0b594e24b4a462"} Mar 21 05:10:05 crc kubenswrapper[4580]: I0321 05:10:05.266893 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="961091fb4a344c0be69ce27e79a4db28456299b6176da5ba3a0b594e24b4a462" Mar 21 05:10:05 crc kubenswrapper[4580]: I0321 05:10:05.266958 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-lls59" Mar 21 05:10:05 crc kubenswrapper[4580]: I0321 05:10:05.335007 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-52mdt"] Mar 21 05:10:05 crc kubenswrapper[4580]: I0321 05:10:05.339882 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-52mdt"] Mar 21 05:10:05 crc kubenswrapper[4580]: I0321 05:10:05.629187 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7c7def8-ae32-4d0f-93db-749023ea9f17" path="/var/lib/kubelet/pods/a7c7def8-ae32-4d0f-93db-749023ea9f17/volumes" Mar 21 05:10:09 crc kubenswrapper[4580]: I0321 05:10:09.068621 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-d5jll" Mar 21 05:10:15 crc kubenswrapper[4580]: I0321 05:10:15.947362 4580 patch_prober.go:28] interesting 
pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:10:15 crc kubenswrapper[4580]: I0321 05:10:15.947879 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.296240 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-487x7"] Mar 21 05:10:25 crc kubenswrapper[4580]: E0321 05:10:25.297225 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d267a81-ba86-4ddd-b83b-37ae171c6230" containerName="oc" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.297260 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d267a81-ba86-4ddd-b83b-37ae171c6230" containerName="oc" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.297417 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d267a81-ba86-4ddd-b83b-37ae171c6230" containerName="oc" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.298284 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-487x7" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.302699 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.302870 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.302924 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.303041 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-tc69l" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.314887 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-487x7"] Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.344547 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sc9w6"] Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.346278 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.348654 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.353542 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/186a861f-3192-4cc7-bb53-e3002f0ed873-config\") pod \"dnsmasq-dns-675f4bcbfc-487x7\" (UID: \"186a861f-3192-4cc7-bb53-e3002f0ed873\") " pod="openstack/dnsmasq-dns-675f4bcbfc-487x7" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.353619 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtx69\" (UniqueName: \"kubernetes.io/projected/186a861f-3192-4cc7-bb53-e3002f0ed873-kube-api-access-vtx69\") pod \"dnsmasq-dns-675f4bcbfc-487x7\" (UID: \"186a861f-3192-4cc7-bb53-e3002f0ed873\") " pod="openstack/dnsmasq-dns-675f4bcbfc-487x7" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.377872 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sc9w6"] Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.454745 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtx69\" (UniqueName: \"kubernetes.io/projected/186a861f-3192-4cc7-bb53-e3002f0ed873-kube-api-access-vtx69\") pod \"dnsmasq-dns-675f4bcbfc-487x7\" (UID: \"186a861f-3192-4cc7-bb53-e3002f0ed873\") " pod="openstack/dnsmasq-dns-675f4bcbfc-487x7" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.454864 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91131bd-c741-4c09-9658-df3c3dc64e84-config\") pod \"dnsmasq-dns-78dd6ddcc-sc9w6\" (UID: \"b91131bd-c741-4c09-9658-df3c3dc64e84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" Mar 
21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.455006 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dns2m\" (UniqueName: \"kubernetes.io/projected/b91131bd-c741-4c09-9658-df3c3dc64e84-kube-api-access-dns2m\") pod \"dnsmasq-dns-78dd6ddcc-sc9w6\" (UID: \"b91131bd-c741-4c09-9658-df3c3dc64e84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.455047 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/186a861f-3192-4cc7-bb53-e3002f0ed873-config\") pod \"dnsmasq-dns-675f4bcbfc-487x7\" (UID: \"186a861f-3192-4cc7-bb53-e3002f0ed873\") " pod="openstack/dnsmasq-dns-675f4bcbfc-487x7" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.455097 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91131bd-c741-4c09-9658-df3c3dc64e84-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sc9w6\" (UID: \"b91131bd-c741-4c09-9658-df3c3dc64e84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.455953 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/186a861f-3192-4cc7-bb53-e3002f0ed873-config\") pod \"dnsmasq-dns-675f4bcbfc-487x7\" (UID: \"186a861f-3192-4cc7-bb53-e3002f0ed873\") " pod="openstack/dnsmasq-dns-675f4bcbfc-487x7" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.478321 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtx69\" (UniqueName: \"kubernetes.io/projected/186a861f-3192-4cc7-bb53-e3002f0ed873-kube-api-access-vtx69\") pod \"dnsmasq-dns-675f4bcbfc-487x7\" (UID: \"186a861f-3192-4cc7-bb53-e3002f0ed873\") " pod="openstack/dnsmasq-dns-675f4bcbfc-487x7" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 
05:10:25.556450 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dns2m\" (UniqueName: \"kubernetes.io/projected/b91131bd-c741-4c09-9658-df3c3dc64e84-kube-api-access-dns2m\") pod \"dnsmasq-dns-78dd6ddcc-sc9w6\" (UID: \"b91131bd-c741-4c09-9658-df3c3dc64e84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.556513 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91131bd-c741-4c09-9658-df3c3dc64e84-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sc9w6\" (UID: \"b91131bd-c741-4c09-9658-df3c3dc64e84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.556563 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91131bd-c741-4c09-9658-df3c3dc64e84-config\") pod \"dnsmasq-dns-78dd6ddcc-sc9w6\" (UID: \"b91131bd-c741-4c09-9658-df3c3dc64e84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.557691 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91131bd-c741-4c09-9658-df3c3dc64e84-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sc9w6\" (UID: \"b91131bd-c741-4c09-9658-df3c3dc64e84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.557857 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91131bd-c741-4c09-9658-df3c3dc64e84-config\") pod \"dnsmasq-dns-78dd6ddcc-sc9w6\" (UID: \"b91131bd-c741-4c09-9658-df3c3dc64e84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.577731 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dns2m\" 
(UniqueName: \"kubernetes.io/projected/b91131bd-c741-4c09-9658-df3c3dc64e84-kube-api-access-dns2m\") pod \"dnsmasq-dns-78dd6ddcc-sc9w6\" (UID: \"b91131bd-c741-4c09-9658-df3c3dc64e84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.617603 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-487x7" Mar 21 05:10:25 crc kubenswrapper[4580]: I0321 05:10:25.666196 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" Mar 21 05:10:26 crc kubenswrapper[4580]: I0321 05:10:26.196708 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sc9w6"] Mar 21 05:10:26 crc kubenswrapper[4580]: W0321 05:10:26.209191 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod186a861f_3192_4cc7_bb53_e3002f0ed873.slice/crio-2a87920441961a719a7e336cc0b6ac93e4a36c512c29873aa580559a14b8d588 WatchSource:0}: Error finding container 2a87920441961a719a7e336cc0b6ac93e4a36c512c29873aa580559a14b8d588: Status 404 returned error can't find the container with id 2a87920441961a719a7e336cc0b6ac93e4a36c512c29873aa580559a14b8d588 Mar 21 05:10:26 crc kubenswrapper[4580]: I0321 05:10:26.209388 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-487x7"] Mar 21 05:10:26 crc kubenswrapper[4580]: I0321 05:10:26.424206 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-487x7" event={"ID":"186a861f-3192-4cc7-bb53-e3002f0ed873","Type":"ContainerStarted","Data":"2a87920441961a719a7e336cc0b6ac93e4a36c512c29873aa580559a14b8d588"} Mar 21 05:10:26 crc kubenswrapper[4580]: I0321 05:10:26.425433 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" 
event={"ID":"b91131bd-c741-4c09-9658-df3c3dc64e84","Type":"ContainerStarted","Data":"8ff35c15eefa22a416844acca313c9dc95c4f089d3ad52903f49f119967a816d"} Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.208603 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-487x7"] Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.246294 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-df9h2"] Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.247610 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.262975 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-df9h2"] Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.300131 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-config\") pod \"dnsmasq-dns-5ccc8479f9-df9h2\" (UID: \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\") " pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.300221 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-df9h2\" (UID: \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\") " pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.300246 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh4gs\" (UniqueName: \"kubernetes.io/projected/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-kube-api-access-sh4gs\") pod \"dnsmasq-dns-5ccc8479f9-df9h2\" (UID: \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.401752 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh4gs\" (UniqueName: \"kubernetes.io/projected/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-kube-api-access-sh4gs\") pod \"dnsmasq-dns-5ccc8479f9-df9h2\" (UID: \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\") " pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.402119 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-config\") pod \"dnsmasq-dns-5ccc8479f9-df9h2\" (UID: \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\") " pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.402191 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-df9h2\" (UID: \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\") " pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.403769 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-df9h2\" (UID: \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\") " pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.405034 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-config\") pod \"dnsmasq-dns-5ccc8479f9-df9h2\" (UID: \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\") " pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.438218 4580 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh4gs\" (UniqueName: \"kubernetes.io/projected/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-kube-api-access-sh4gs\") pod \"dnsmasq-dns-5ccc8479f9-df9h2\" (UID: \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\") " pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.567348 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sc9w6"] Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.573568 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.597385 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-69g76"] Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.601122 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-69g76" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.639193 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-69g76"] Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.706969 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41d2dc91-5cd7-44df-94cc-7a4f63a14193-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-69g76\" (UID: \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\") " pod="openstack/dnsmasq-dns-57d769cc4f-69g76" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.707030 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41d2dc91-5cd7-44df-94cc-7a4f63a14193-config\") pod \"dnsmasq-dns-57d769cc4f-69g76\" (UID: \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\") " pod="openstack/dnsmasq-dns-57d769cc4f-69g76" Mar 21 05:10:28 crc 
kubenswrapper[4580]: I0321 05:10:28.707490 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtg5r\" (UniqueName: \"kubernetes.io/projected/41d2dc91-5cd7-44df-94cc-7a4f63a14193-kube-api-access-rtg5r\") pod \"dnsmasq-dns-57d769cc4f-69g76\" (UID: \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\") " pod="openstack/dnsmasq-dns-57d769cc4f-69g76" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.811460 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtg5r\" (UniqueName: \"kubernetes.io/projected/41d2dc91-5cd7-44df-94cc-7a4f63a14193-kube-api-access-rtg5r\") pod \"dnsmasq-dns-57d769cc4f-69g76\" (UID: \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\") " pod="openstack/dnsmasq-dns-57d769cc4f-69g76" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.811520 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41d2dc91-5cd7-44df-94cc-7a4f63a14193-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-69g76\" (UID: \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\") " pod="openstack/dnsmasq-dns-57d769cc4f-69g76" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.811552 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41d2dc91-5cd7-44df-94cc-7a4f63a14193-config\") pod \"dnsmasq-dns-57d769cc4f-69g76\" (UID: \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\") " pod="openstack/dnsmasq-dns-57d769cc4f-69g76" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.812913 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41d2dc91-5cd7-44df-94cc-7a4f63a14193-config\") pod \"dnsmasq-dns-57d769cc4f-69g76\" (UID: \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\") " pod="openstack/dnsmasq-dns-57d769cc4f-69g76" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.813908 4580 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41d2dc91-5cd7-44df-94cc-7a4f63a14193-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-69g76\" (UID: \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\") " pod="openstack/dnsmasq-dns-57d769cc4f-69g76" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.840574 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtg5r\" (UniqueName: \"kubernetes.io/projected/41d2dc91-5cd7-44df-94cc-7a4f63a14193-kube-api-access-rtg5r\") pod \"dnsmasq-dns-57d769cc4f-69g76\" (UID: \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\") " pod="openstack/dnsmasq-dns-57d769cc4f-69g76" Mar 21 05:10:28 crc kubenswrapper[4580]: I0321 05:10:28.993023 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-69g76" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.118128 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-df9h2"] Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.429976 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.436732 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.446550 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.452554 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.452677 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.452737 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-9cjgq" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.452917 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.453032 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.453054 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.453193 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.492909 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" event={"ID":"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd","Type":"ContainerStarted","Data":"6bc21e09904c5a22e8b34e61cd10d0dabaea06099abd6c40a05fa4a1e99e497e"} Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.521435 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.521480 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.521516 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.521546 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.521570 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.521603 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.521623 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.521638 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ac0ed353-d343-4f14-804b-affb2f0cc4d6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.521668 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ac0ed353-d343-4f14-804b-affb2f0cc4d6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.521697 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.521726 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsfc8\" (UniqueName: 
\"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-kube-api-access-tsfc8\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.624584 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.624651 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsfc8\" (UniqueName: \"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-kube-api-access-tsfc8\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.624680 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.624696 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.624731 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-server-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.624766 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.624806 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.624844 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.624873 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.624888 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ac0ed353-d343-4f14-804b-affb2f0cc4d6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 
crc kubenswrapper[4580]: I0321 05:10:29.624920 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ac0ed353-d343-4f14-804b-affb2f0cc4d6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.626405 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.626463 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.626994 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.627251 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.627556 4580 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.628269 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.634355 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.640863 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ac0ed353-d343-4f14-804b-affb2f0cc4d6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.648957 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-69g76"] Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.648970 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsfc8\" (UniqueName: \"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-kube-api-access-tsfc8\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.649481 4580 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ac0ed353-d343-4f14-804b-affb2f0cc4d6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.662265 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.662752 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.761636 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.765008 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.770789 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.770941 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.771058 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.771169 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.771264 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.771361 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-h94bp" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.778230 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.800568 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.856521 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.936640 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.936727 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.936761 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-config-data\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.936803 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.936905 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.936943 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.936973 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-server-conf\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.937005 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-pod-info\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.937040 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px6ls\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-kube-api-access-px6ls\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.937070 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:29 crc kubenswrapper[4580]: I0321 05:10:29.938387 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.041116 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-server-conf\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.041401 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-pod-info\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.041508 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px6ls\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-kube-api-access-px6ls\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.042477 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.045314 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.045413 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.045480 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.045773 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-config-data\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.045832 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.045939 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.045994 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.046346 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.046843 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.047140 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-pod-info\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.047144 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.047738 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.048107 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.049342 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.051031 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.061145 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.063956 4580 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px6ls\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-kube-api-access-px6ls\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.102361 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-server-conf\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.143348 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.392418 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.519284 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.520055 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-69g76" event={"ID":"41d2dc91-5cd7-44df-94cc-7a4f63a14193","Type":"ContainerStarted","Data":"de6064ae6aff0c3ea45b4d7303b7730f8f8cb651d7a3366590d4bf4c00b62d6f"} Mar 21 05:10:30 crc kubenswrapper[4580]: W0321 05:10:30.541505 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac0ed353_d343_4f14_804b_affb2f0cc4d6.slice/crio-ff05058a223549e96d9ab65726160b47c29db63239e89bb285ba89a1a353f94a WatchSource:0}: Error finding container ff05058a223549e96d9ab65726160b47c29db63239e89bb285ba89a1a353f94a: Status 404 returned error can't find the container with id ff05058a223549e96d9ab65726160b47c29db63239e89bb285ba89a1a353f94a Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.802221 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.803601 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.808074 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-82dq9" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.808345 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.812553 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.812733 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.819323 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.819318 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.869887 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b4f4841a-f9ee-4d9d-b756-77cabd20363a-config-data-default\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.869934 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f4841a-f9ee-4d9d-b756-77cabd20363a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.869958 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4f4841a-f9ee-4d9d-b756-77cabd20363a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.869989 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f4841a-f9ee-4d9d-b756-77cabd20363a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.870075 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.870110 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d85dm\" (UniqueName: \"kubernetes.io/projected/b4f4841a-f9ee-4d9d-b756-77cabd20363a-kube-api-access-d85dm\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.870179 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4f4841a-f9ee-4d9d-b756-77cabd20363a-kolla-config\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.870200 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/b4f4841a-f9ee-4d9d-b756-77cabd20363a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.971945 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4f4841a-f9ee-4d9d-b756-77cabd20363a-kolla-config\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.972015 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b4f4841a-f9ee-4d9d-b756-77cabd20363a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.972057 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b4f4841a-f9ee-4d9d-b756-77cabd20363a-config-data-default\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.972091 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f4841a-f9ee-4d9d-b756-77cabd20363a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.972129 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4f4841a-f9ee-4d9d-b756-77cabd20363a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.972166 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f4841a-f9ee-4d9d-b756-77cabd20363a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.972191 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.972227 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d85dm\" (UniqueName: \"kubernetes.io/projected/b4f4841a-f9ee-4d9d-b756-77cabd20363a-kube-api-access-d85dm\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.972630 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b4f4841a-f9ee-4d9d-b756-77cabd20363a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.972900 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4f4841a-f9ee-4d9d-b756-77cabd20363a-kolla-config\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.973654 4580 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b4f4841a-f9ee-4d9d-b756-77cabd20363a-config-data-default\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.973798 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Mar 21 05:10:30 crc kubenswrapper[4580]: I0321 05:10:30.974068 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f4841a-f9ee-4d9d-b756-77cabd20363a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:31 crc kubenswrapper[4580]: I0321 05:10:31.001175 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4f4841a-f9ee-4d9d-b756-77cabd20363a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:31 crc kubenswrapper[4580]: I0321 05:10:31.001198 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d85dm\" (UniqueName: \"kubernetes.io/projected/b4f4841a-f9ee-4d9d-b756-77cabd20363a-kube-api-access-d85dm\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:31 crc kubenswrapper[4580]: I0321 05:10:31.004003 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b4f4841a-f9ee-4d9d-b756-77cabd20363a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:31 crc kubenswrapper[4580]: I0321 05:10:31.008662 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"b4f4841a-f9ee-4d9d-b756-77cabd20363a\") " pod="openstack/openstack-galera-0" Mar 21 05:10:31 crc kubenswrapper[4580]: I0321 05:10:31.142346 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 21 05:10:31 crc kubenswrapper[4580]: I0321 05:10:31.188466 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:10:31 crc kubenswrapper[4580]: W0321 05:10:31.250221 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38ef0f25_a572_4eaf_95ed_07b7f6ffaeaf.slice/crio-f1c8689579ebf583eb3fef071adea86e79fbb3c5249e649fa8a7df56e847fde0 WatchSource:0}: Error finding container f1c8689579ebf583eb3fef071adea86e79fbb3c5249e649fa8a7df56e847fde0: Status 404 returned error can't find the container with id f1c8689579ebf583eb3fef071adea86e79fbb3c5249e649fa8a7df56e847fde0 Mar 21 05:10:31 crc kubenswrapper[4580]: I0321 05:10:31.589604 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ac0ed353-d343-4f14-804b-affb2f0cc4d6","Type":"ContainerStarted","Data":"ff05058a223549e96d9ab65726160b47c29db63239e89bb285ba89a1a353f94a"} Mar 21 05:10:31 crc kubenswrapper[4580]: I0321 05:10:31.600580 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf","Type":"ContainerStarted","Data":"f1c8689579ebf583eb3fef071adea86e79fbb3c5249e649fa8a7df56e847fde0"} Mar 21 05:10:31 crc kubenswrapper[4580]: I0321 05:10:31.949129 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.221847 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.223176 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.226440 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-4zs28" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.226692 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.227017 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.234830 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.236967 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.402553 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2da281c0-51d3-4264-8924-83dbc85ecbf0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.402598 4580 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2da281c0-51d3-4264-8924-83dbc85ecbf0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.402631 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.402647 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da281c0-51d3-4264-8924-83dbc85ecbf0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.402668 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2da281c0-51d3-4264-8924-83dbc85ecbf0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.402714 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2da281c0-51d3-4264-8924-83dbc85ecbf0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.402731 4580 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkvzs\" (UniqueName: \"kubernetes.io/projected/2da281c0-51d3-4264-8924-83dbc85ecbf0-kube-api-access-zkvzs\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.402746 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2da281c0-51d3-4264-8924-83dbc85ecbf0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.507040 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2da281c0-51d3-4264-8924-83dbc85ecbf0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.507096 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2da281c0-51d3-4264-8924-83dbc85ecbf0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.507135 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.507154 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da281c0-51d3-4264-8924-83dbc85ecbf0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.507170 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2da281c0-51d3-4264-8924-83dbc85ecbf0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.507216 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2da281c0-51d3-4264-8924-83dbc85ecbf0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.507233 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2da281c0-51d3-4264-8924-83dbc85ecbf0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.507248 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkvzs\" (UniqueName: \"kubernetes.io/projected/2da281c0-51d3-4264-8924-83dbc85ecbf0-kube-api-access-zkvzs\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.509053 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2da281c0-51d3-4264-8924-83dbc85ecbf0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.509358 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2da281c0-51d3-4264-8924-83dbc85ecbf0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.509564 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2da281c0-51d3-4264-8924-83dbc85ecbf0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.510060 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.511423 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2da281c0-51d3-4264-8924-83dbc85ecbf0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.547612 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2da281c0-51d3-4264-8924-83dbc85ecbf0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.552965 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da281c0-51d3-4264-8924-83dbc85ecbf0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.566919 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.571237 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.572521 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.581628 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.581851 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-xnnld" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.583703 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.584444 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkvzs\" (UniqueName: \"kubernetes.io/projected/2da281c0-51d3-4264-8924-83dbc85ecbf0-kube-api-access-zkvzs\") pod \"openstack-cell1-galera-0\" (UID: \"2da281c0-51d3-4264-8924-83dbc85ecbf0\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.591740 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.682029 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b4f4841a-f9ee-4d9d-b756-77cabd20363a","Type":"ContainerStarted","Data":"c344eed6e06090acfdb809f2eacdc6993d646b3fa6424dc9e2876306007bf4cd"} Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.722858 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.722942 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-kolla-config\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.723195 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.723409 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-config-data\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.723465 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjf45\" (UniqueName: \"kubernetes.io/projected/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-kube-api-access-zjf45\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.833081 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.833373 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-config-data\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " 
pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.834404 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf45\" (UniqueName: \"kubernetes.io/projected/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-kube-api-access-zjf45\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.834445 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-config-data\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.834618 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.834682 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-kolla-config\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.835424 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-kolla-config\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.851117 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.853814 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjf45\" (UniqueName: \"kubernetes.io/projected/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-kube-api-access-zjf45\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.855010 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/226921bf-412a-4dc6-a722-3fcf5ecc7fdc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"226921bf-412a-4dc6-a722-3fcf5ecc7fdc\") " pod="openstack/memcached-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.878010 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 21 05:10:32 crc kubenswrapper[4580]: I0321 05:10:32.957744 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 21 05:10:33 crc kubenswrapper[4580]: I0321 05:10:33.799343 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 21 05:10:33 crc kubenswrapper[4580]: I0321 05:10:33.896132 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 21 05:10:33 crc kubenswrapper[4580]: W0321 05:10:33.920926 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod226921bf_412a_4dc6_a722_3fcf5ecc7fdc.slice/crio-3e793bfdf78b57fed146900bbf6f93e1449d8d92c3ead430d30e06c123a04859 WatchSource:0}: Error finding container 3e793bfdf78b57fed146900bbf6f93e1449d8d92c3ead430d30e06c123a04859: Status 404 returned error can't find the container with id 3e793bfdf78b57fed146900bbf6f93e1449d8d92c3ead430d30e06c123a04859 Mar 21 05:10:34 crc kubenswrapper[4580]: I0321 05:10:34.752874 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"226921bf-412a-4dc6-a722-3fcf5ecc7fdc","Type":"ContainerStarted","Data":"3e793bfdf78b57fed146900bbf6f93e1449d8d92c3ead430d30e06c123a04859"} Mar 21 05:10:34 crc kubenswrapper[4580]: I0321 05:10:34.770523 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2da281c0-51d3-4264-8924-83dbc85ecbf0","Type":"ContainerStarted","Data":"504e9b365147b9be8f174f8a6925bb1111aed97e5e26f60ad8950e4278089326"} Mar 21 05:10:35 crc kubenswrapper[4580]: I0321 05:10:34.999319 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:10:35 crc kubenswrapper[4580]: I0321 05:10:35.005601 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 05:10:35 crc kubenswrapper[4580]: I0321 05:10:35.010750 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-szd8w" Mar 21 05:10:35 crc kubenswrapper[4580]: I0321 05:10:35.011752 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:10:35 crc kubenswrapper[4580]: I0321 05:10:35.113561 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7fn9\" (UniqueName: \"kubernetes.io/projected/caa7b0b4-ac59-4338-896e-723db48b3d24-kube-api-access-t7fn9\") pod \"kube-state-metrics-0\" (UID: \"caa7b0b4-ac59-4338-896e-723db48b3d24\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:35 crc kubenswrapper[4580]: I0321 05:10:35.215209 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7fn9\" (UniqueName: \"kubernetes.io/projected/caa7b0b4-ac59-4338-896e-723db48b3d24-kube-api-access-t7fn9\") pod \"kube-state-metrics-0\" (UID: \"caa7b0b4-ac59-4338-896e-723db48b3d24\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:35 crc kubenswrapper[4580]: I0321 05:10:35.257025 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7fn9\" (UniqueName: \"kubernetes.io/projected/caa7b0b4-ac59-4338-896e-723db48b3d24-kube-api-access-t7fn9\") pod \"kube-state-metrics-0\" (UID: \"caa7b0b4-ac59-4338-896e-723db48b3d24\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:35 crc kubenswrapper[4580]: I0321 05:10:35.343988 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 05:10:37 crc kubenswrapper[4580]: I0321 05:10:37.984010 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jjv5q"] Mar 21 05:10:37 crc kubenswrapper[4580]: I0321 05:10:37.985687 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:37 crc kubenswrapper[4580]: I0321 05:10:37.992913 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-686gt" Mar 21 05:10:37 crc kubenswrapper[4580]: I0321 05:10:37.993200 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 21 05:10:37 crc kubenswrapper[4580]: I0321 05:10:37.993339 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.052223 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tqfdg"] Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.059721 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.102719 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15016044-062f-44bc-8278-97a43b709083-scripts\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.105595 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15016044-062f-44bc-8278-97a43b709083-combined-ca-bundle\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.106649 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbg47\" (UniqueName: \"kubernetes.io/projected/15016044-062f-44bc-8278-97a43b709083-kube-api-access-rbg47\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.107682 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/15016044-062f-44bc-8278-97a43b709083-var-log-ovn\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.107859 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15016044-062f-44bc-8278-97a43b709083-var-run\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " 
pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.108128 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/893ab010-283a-4331-834a-05586719a352-var-run\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.108896 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/893ab010-283a-4331-834a-05586719a352-var-lib\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.109971 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/893ab010-283a-4331-834a-05586719a352-scripts\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.110144 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/15016044-062f-44bc-8278-97a43b709083-ovn-controller-tls-certs\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.110270 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqwbj\" (UniqueName: \"kubernetes.io/projected/893ab010-283a-4331-834a-05586719a352-kube-api-access-vqwbj\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " 
pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.110399 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/893ab010-283a-4331-834a-05586719a352-etc-ovs\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.123267 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/893ab010-283a-4331-834a-05586719a352-var-log\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.123326 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/15016044-062f-44bc-8278-97a43b709083-var-run-ovn\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.131864 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jjv5q"] Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.149287 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tqfdg"] Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.224446 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/893ab010-283a-4331-834a-05586719a352-scripts\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.224490 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/15016044-062f-44bc-8278-97a43b709083-ovn-controller-tls-certs\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.224510 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqwbj\" (UniqueName: \"kubernetes.io/projected/893ab010-283a-4331-834a-05586719a352-kube-api-access-vqwbj\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.224534 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/893ab010-283a-4331-834a-05586719a352-etc-ovs\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.224549 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/893ab010-283a-4331-834a-05586719a352-var-log\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.224564 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/15016044-062f-44bc-8278-97a43b709083-var-run-ovn\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.224995 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/893ab010-283a-4331-834a-05586719a352-etc-ovs\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.225069 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/893ab010-283a-4331-834a-05586719a352-var-log\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.225192 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/15016044-062f-44bc-8278-97a43b709083-var-run-ovn\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.225255 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15016044-062f-44bc-8278-97a43b709083-scripts\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.225273 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15016044-062f-44bc-8278-97a43b709083-combined-ca-bundle\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.225291 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbg47\" (UniqueName: \"kubernetes.io/projected/15016044-062f-44bc-8278-97a43b709083-kube-api-access-rbg47\") pod \"ovn-controller-jjv5q\" (UID: 
\"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.225322 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/15016044-062f-44bc-8278-97a43b709083-var-log-ovn\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.225338 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15016044-062f-44bc-8278-97a43b709083-var-run\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.225357 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/893ab010-283a-4331-834a-05586719a352-var-run\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.225380 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/893ab010-283a-4331-834a-05586719a352-var-lib\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.225528 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/893ab010-283a-4331-834a-05586719a352-var-lib\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.226159 4580 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/15016044-062f-44bc-8278-97a43b709083-var-log-ovn\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.226717 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15016044-062f-44bc-8278-97a43b709083-var-run\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.226769 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/893ab010-283a-4331-834a-05586719a352-var-run\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.229988 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15016044-062f-44bc-8278-97a43b709083-scripts\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.234664 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/15016044-062f-44bc-8278-97a43b709083-ovn-controller-tls-certs\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.246225 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/893ab010-283a-4331-834a-05586719a352-scripts\") pod \"ovn-controller-ovs-tqfdg\" (UID: 
\"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.255098 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqwbj\" (UniqueName: \"kubernetes.io/projected/893ab010-283a-4331-834a-05586719a352-kube-api-access-vqwbj\") pod \"ovn-controller-ovs-tqfdg\" (UID: \"893ab010-283a-4331-834a-05586719a352\") " pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.260345 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbg47\" (UniqueName: \"kubernetes.io/projected/15016044-062f-44bc-8278-97a43b709083-kube-api-access-rbg47\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.261097 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15016044-062f-44bc-8278-97a43b709083-combined-ca-bundle\") pod \"ovn-controller-jjv5q\" (UID: \"15016044-062f-44bc-8278-97a43b709083\") " pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.316848 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jjv5q" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.430275 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.815021 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.818247 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.820622 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-fgzjb" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.822947 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.823142 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.823318 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.823481 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.823487 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.937409 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/358c9476-8608-43e6-9912-6be4fb3f2ba8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.937460 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/358c9476-8608-43e6-9912-6be4fb3f2ba8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.937499 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/358c9476-8608-43e6-9912-6be4fb3f2ba8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.937519 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.937571 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/358c9476-8608-43e6-9912-6be4fb3f2ba8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.937602 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/358c9476-8608-43e6-9912-6be4fb3f2ba8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.937629 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcfd6\" (UniqueName: \"kubernetes.io/projected/358c9476-8608-43e6-9912-6be4fb3f2ba8-kube-api-access-zcfd6\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:38 crc kubenswrapper[4580]: I0321 05:10:38.937645 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/358c9476-8608-43e6-9912-6be4fb3f2ba8-config\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.039630 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/358c9476-8608-43e6-9912-6be4fb3f2ba8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.041320 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/358c9476-8608-43e6-9912-6be4fb3f2ba8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.041400 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/358c9476-8608-43e6-9912-6be4fb3f2ba8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.041435 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.041573 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/358c9476-8608-43e6-9912-6be4fb3f2ba8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " 
pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.041628 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/358c9476-8608-43e6-9912-6be4fb3f2ba8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.041718 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcfd6\" (UniqueName: \"kubernetes.io/projected/358c9476-8608-43e6-9912-6be4fb3f2ba8-kube-api-access-zcfd6\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.041744 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358c9476-8608-43e6-9912-6be4fb3f2ba8-config\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.041838 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.042756 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/358c9476-8608-43e6-9912-6be4fb3f2ba8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.042824 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/358c9476-8608-43e6-9912-6be4fb3f2ba8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.045211 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358c9476-8608-43e6-9912-6be4fb3f2ba8-config\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.048251 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/358c9476-8608-43e6-9912-6be4fb3f2ba8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.052858 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/358c9476-8608-43e6-9912-6be4fb3f2ba8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.076479 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/358c9476-8608-43e6-9912-6be4fb3f2ba8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.078808 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcfd6\" (UniqueName: \"kubernetes.io/projected/358c9476-8608-43e6-9912-6be4fb3f2ba8-kube-api-access-zcfd6\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " 
pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.115507 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"358c9476-8608-43e6-9912-6be4fb3f2ba8\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:39 crc kubenswrapper[4580]: I0321 05:10:39.165411 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 21 05:10:41 crc kubenswrapper[4580]: I0321 05:10:41.872633 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 21 05:10:41 crc kubenswrapper[4580]: I0321 05:10:41.878870 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:41 crc kubenswrapper[4580]: I0321 05:10:41.882170 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 21 05:10:41 crc kubenswrapper[4580]: I0321 05:10:41.882378 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 21 05:10:41 crc kubenswrapper[4580]: I0321 05:10:41.882499 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5xbqk" Mar 21 05:10:41 crc kubenswrapper[4580]: I0321 05:10:41.882688 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 21 05:10:41 crc kubenswrapper[4580]: I0321 05:10:41.899825 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.022688 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514b5967-88ad-43e2-aa38-88551fba381d-combined-ca-bundle\") pod 
\"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.022760 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.022818 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/514b5967-88ad-43e2-aa38-88551fba381d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.022924 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/514b5967-88ad-43e2-aa38-88551fba381d-config\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.022970 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/514b5967-88ad-43e2-aa38-88551fba381d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.022991 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/514b5967-88ad-43e2-aa38-88551fba381d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " 
pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.023023 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zmdt\" (UniqueName: \"kubernetes.io/projected/514b5967-88ad-43e2-aa38-88551fba381d-kube-api-access-7zmdt\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.023068 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/514b5967-88ad-43e2-aa38-88551fba381d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.125935 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.126249 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/514b5967-88ad-43e2-aa38-88551fba381d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.126203 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.126733 4580 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/514b5967-88ad-43e2-aa38-88551fba381d-config\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.126765 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/514b5967-88ad-43e2-aa38-88551fba381d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.126802 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/514b5967-88ad-43e2-aa38-88551fba381d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.126835 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zmdt\" (UniqueName: \"kubernetes.io/projected/514b5967-88ad-43e2-aa38-88551fba381d-kube-api-access-7zmdt\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.126872 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/514b5967-88ad-43e2-aa38-88551fba381d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.126937 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/514b5967-88ad-43e2-aa38-88551fba381d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.127649 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/514b5967-88ad-43e2-aa38-88551fba381d-config\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.131094 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/514b5967-88ad-43e2-aa38-88551fba381d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.131366 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/514b5967-88ad-43e2-aa38-88551fba381d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.133920 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514b5967-88ad-43e2-aa38-88551fba381d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.133999 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/514b5967-88ad-43e2-aa38-88551fba381d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc 
kubenswrapper[4580]: I0321 05:10:42.146178 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/514b5967-88ad-43e2-aa38-88551fba381d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.147262 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zmdt\" (UniqueName: \"kubernetes.io/projected/514b5967-88ad-43e2-aa38-88551fba381d-kube-api-access-7zmdt\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.147977 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"514b5967-88ad-43e2-aa38-88551fba381d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:42 crc kubenswrapper[4580]: I0321 05:10:42.203041 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 21 05:10:45 crc kubenswrapper[4580]: I0321 05:10:45.947315 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:10:45 crc kubenswrapper[4580]: I0321 05:10:45.947587 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:10:52 crc kubenswrapper[4580]: I0321 05:10:52.113622 4580 scope.go:117] "RemoveContainer" containerID="ceb20283c2e8e490de1e43528dc0099be0fba578cd24941750b1143b6d33fb17" Mar 21 05:10:55 crc kubenswrapper[4580]: E0321 05:10:55.068901 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Mar 21 05:10:55 crc kubenswrapper[4580]: E0321 05:10:55.069306 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d85dm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(b4f4841a-f9ee-4d9d-b756-77cabd20363a): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:10:55 crc kubenswrapper[4580]: E0321 05:10:55.070465 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="b4f4841a-f9ee-4d9d-b756-77cabd20363a" Mar 21 05:10:56 crc kubenswrapper[4580]: E0321 05:10:55.999463 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="b4f4841a-f9ee-4d9d-b756-77cabd20363a" Mar 21 05:10:56 crc kubenswrapper[4580]: E0321 05:10:56.075207 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Mar 21 05:10:56 crc kubenswrapper[4580]: E0321 05:10:56.075397 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n58ch549h5cfh64ch596hf5h69h665h68bh96h559h5d6h58dh594h5b5h645h5cch5d8h697h57dhch699h58ch59ch65dh649h688h664h559hfchbbh68q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zjf45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(226921bf-412a-4dc6-a722-3fcf5ecc7fdc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:10:56 crc kubenswrapper[4580]: E0321 05:10:56.076980 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="226921bf-412a-4dc6-a722-3fcf5ecc7fdc" Mar 21 05:10:57 crc kubenswrapper[4580]: E0321 05:10:57.007060 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="226921bf-412a-4dc6-a722-3fcf5ecc7fdc" Mar 21 05:10:57 crc kubenswrapper[4580]: E0321 05:10:57.076195 4580 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 21 05:10:57 crc kubenswrapper[4580]: E0321 05:10:57.076338 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sh4gs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesyst
em:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-df9h2_openstack(58549e5a-e3fb-4f2e-ac39-999a8dfb3efd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:10:57 crc kubenswrapper[4580]: E0321 05:10:57.077644 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" podUID="58549e5a-e3fb-4f2e-ac39-999a8dfb3efd" Mar 21 05:10:57 crc kubenswrapper[4580]: E0321 05:10:57.091013 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 21 05:10:57 crc kubenswrapper[4580]: E0321 05:10:57.091233 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rtg5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-69g76_openstack(41d2dc91-5cd7-44df-94cc-7a4f63a14193): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:10:57 crc kubenswrapper[4580]: E0321 05:10:57.093195 4580 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-69g76" podUID="41d2dc91-5cd7-44df-94cc-7a4f63a14193" Mar 21 05:10:57 crc kubenswrapper[4580]: E0321 05:10:57.105142 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 21 05:10:57 crc kubenswrapper[4580]: E0321 05:10:57.105309 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dns2m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-sc9w6_openstack(b91131bd-c741-4c09-9658-df3c3dc64e84): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:10:57 crc kubenswrapper[4580]: E0321 05:10:57.106512 4580 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" podUID="b91131bd-c741-4c09-9658-df3c3dc64e84" Mar 21 05:10:57 crc kubenswrapper[4580]: E0321 05:10:57.241813 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 21 05:10:57 crc kubenswrapper[4580]: E0321 05:10:57.241942 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vtx69,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-487x7_openstack(186a861f-3192-4cc7-bb53-e3002f0ed873): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:10:57 crc kubenswrapper[4580]: E0321 05:10:57.243087 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-487x7" podUID="186a861f-3192-4cc7-bb53-e3002f0ed873" Mar 21 05:10:57 crc kubenswrapper[4580]: I0321 05:10:57.741892 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jjv5q"] Mar 21 05:10:57 crc kubenswrapper[4580]: I0321 05:10:57.807766 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.013237 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"caa7b0b4-ac59-4338-896e-723db48b3d24","Type":"ContainerStarted","Data":"15c1b6ce88564a1603e9486fd67ed1d7e575fcf7ea2241910d51a6113a686dee"} Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.014947 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jjv5q" event={"ID":"15016044-062f-44bc-8278-97a43b709083","Type":"ContainerStarted","Data":"ecf4026987e8af267edbd2536ef2c97fc6d34644edc0ea356ea33b734fc2fbf0"} Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.017400 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2da281c0-51d3-4264-8924-83dbc85ecbf0","Type":"ContainerStarted","Data":"e716bf087ffd37acdbe79d9806c5919ea5f5dc237e0b1ec46d3ee5106548ad8e"} Mar 21 05:10:58 crc kubenswrapper[4580]: E0321 05:10:58.018840 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" podUID="58549e5a-e3fb-4f2e-ac39-999a8dfb3efd" Mar 21 05:10:58 crc kubenswrapper[4580]: E0321 05:10:58.020125 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-69g76" podUID="41d2dc91-5cd7-44df-94cc-7a4f63a14193" Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.349057 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 21 05:10:58 crc kubenswrapper[4580]: W0321 05:10:58.565131 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod358c9476_8608_43e6_9912_6be4fb3f2ba8.slice/crio-155a3091550d0e29504aae5fedc8610617c1fa4b69524a632971bde74d2111f9 WatchSource:0}: Error finding container 155a3091550d0e29504aae5fedc8610617c1fa4b69524a632971bde74d2111f9: Status 404 returned error can't find the container with id 155a3091550d0e29504aae5fedc8610617c1fa4b69524a632971bde74d2111f9 Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.634688 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.659540 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-487x7" Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.702459 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91131bd-c741-4c09-9658-df3c3dc64e84-config\") pod \"b91131bd-c741-4c09-9658-df3c3dc64e84\" (UID: \"b91131bd-c741-4c09-9658-df3c3dc64e84\") " Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.702635 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/186a861f-3192-4cc7-bb53-e3002f0ed873-config\") pod \"186a861f-3192-4cc7-bb53-e3002f0ed873\" (UID: \"186a861f-3192-4cc7-bb53-e3002f0ed873\") " Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.702704 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dns2m\" (UniqueName: \"kubernetes.io/projected/b91131bd-c741-4c09-9658-df3c3dc64e84-kube-api-access-dns2m\") pod \"b91131bd-c741-4c09-9658-df3c3dc64e84\" (UID: \"b91131bd-c741-4c09-9658-df3c3dc64e84\") " Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.702723 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91131bd-c741-4c09-9658-df3c3dc64e84-dns-svc\") pod \"b91131bd-c741-4c09-9658-df3c3dc64e84\" (UID: \"b91131bd-c741-4c09-9658-df3c3dc64e84\") " Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.702813 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtx69\" (UniqueName: \"kubernetes.io/projected/186a861f-3192-4cc7-bb53-e3002f0ed873-kube-api-access-vtx69\") pod \"186a861f-3192-4cc7-bb53-e3002f0ed873\" (UID: \"186a861f-3192-4cc7-bb53-e3002f0ed873\") " Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.705861 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/186a861f-3192-4cc7-bb53-e3002f0ed873-config" (OuterVolumeSpecName: "config") pod "186a861f-3192-4cc7-bb53-e3002f0ed873" (UID: "186a861f-3192-4cc7-bb53-e3002f0ed873"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.706301 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91131bd-c741-4c09-9658-df3c3dc64e84-config" (OuterVolumeSpecName: "config") pod "b91131bd-c741-4c09-9658-df3c3dc64e84" (UID: "b91131bd-c741-4c09-9658-df3c3dc64e84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.706418 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91131bd-c741-4c09-9658-df3c3dc64e84-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b91131bd-c741-4c09-9658-df3c3dc64e84" (UID: "b91131bd-c741-4c09-9658-df3c3dc64e84"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.710805 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/186a861f-3192-4cc7-bb53-e3002f0ed873-kube-api-access-vtx69" (OuterVolumeSpecName: "kube-api-access-vtx69") pod "186a861f-3192-4cc7-bb53-e3002f0ed873" (UID: "186a861f-3192-4cc7-bb53-e3002f0ed873"). InnerVolumeSpecName "kube-api-access-vtx69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.723821 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91131bd-c741-4c09-9658-df3c3dc64e84-kube-api-access-dns2m" (OuterVolumeSpecName: "kube-api-access-dns2m") pod "b91131bd-c741-4c09-9658-df3c3dc64e84" (UID: "b91131bd-c741-4c09-9658-df3c3dc64e84"). InnerVolumeSpecName "kube-api-access-dns2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.805574 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dns2m\" (UniqueName: \"kubernetes.io/projected/b91131bd-c741-4c09-9658-df3c3dc64e84-kube-api-access-dns2m\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.805630 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91131bd-c741-4c09-9658-df3c3dc64e84-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.805644 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtx69\" (UniqueName: \"kubernetes.io/projected/186a861f-3192-4cc7-bb53-e3002f0ed873-kube-api-access-vtx69\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.805657 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91131bd-c741-4c09-9658-df3c3dc64e84-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.805671 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/186a861f-3192-4cc7-bb53-e3002f0ed873-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:58 crc kubenswrapper[4580]: I0321 05:10:58.982606 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 21 05:10:59 crc kubenswrapper[4580]: I0321 05:10:59.028453 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"358c9476-8608-43e6-9912-6be4fb3f2ba8","Type":"ContainerStarted","Data":"155a3091550d0e29504aae5fedc8610617c1fa4b69524a632971bde74d2111f9"} Mar 21 05:10:59 crc kubenswrapper[4580]: I0321 05:10:59.031011 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-487x7" Mar 21 05:10:59 crc kubenswrapper[4580]: I0321 05:10:59.031029 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-487x7" event={"ID":"186a861f-3192-4cc7-bb53-e3002f0ed873","Type":"ContainerDied","Data":"2a87920441961a719a7e336cc0b6ac93e4a36c512c29873aa580559a14b8d588"} Mar 21 05:10:59 crc kubenswrapper[4580]: I0321 05:10:59.035744 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ac0ed353-d343-4f14-804b-affb2f0cc4d6","Type":"ContainerStarted","Data":"517c3f090ecfb64bc2b909bb5db8ba938766cb21ee2ca89f76f4bc37007b4577"} Mar 21 05:10:59 crc kubenswrapper[4580]: I0321 05:10:59.041075 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" event={"ID":"b91131bd-c741-4c09-9658-df3c3dc64e84","Type":"ContainerDied","Data":"8ff35c15eefa22a416844acca313c9dc95c4f089d3ad52903f49f119967a816d"} Mar 21 05:10:59 crc kubenswrapper[4580]: I0321 05:10:59.041136 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sc9w6" Mar 21 05:10:59 crc kubenswrapper[4580]: I0321 05:10:59.049017 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf","Type":"ContainerStarted","Data":"4d38eef5a720f735d000140426f0b792c62476df0bed0a49cfb7c06e936f571f"} Mar 21 05:10:59 crc kubenswrapper[4580]: I0321 05:10:59.235619 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-487x7"] Mar 21 05:10:59 crc kubenswrapper[4580]: I0321 05:10:59.243895 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-487x7"] Mar 21 05:10:59 crc kubenswrapper[4580]: I0321 05:10:59.263874 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sc9w6"] Mar 21 05:10:59 crc kubenswrapper[4580]: I0321 05:10:59.268836 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sc9w6"] Mar 21 05:10:59 crc kubenswrapper[4580]: I0321 05:10:59.273603 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tqfdg"] Mar 21 05:10:59 crc kubenswrapper[4580]: W0321 05:10:59.473324 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod893ab010_283a_4331_834a_05586719a352.slice/crio-e681b0e92666b734b06e1cc41930ea4b98f836300f3b8a55cdb948aed9bd9c15 WatchSource:0}: Error finding container e681b0e92666b734b06e1cc41930ea4b98f836300f3b8a55cdb948aed9bd9c15: Status 404 returned error can't find the container with id e681b0e92666b734b06e1cc41930ea4b98f836300f3b8a55cdb948aed9bd9c15 Mar 21 05:10:59 crc kubenswrapper[4580]: I0321 05:10:59.638397 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="186a861f-3192-4cc7-bb53-e3002f0ed873" path="/var/lib/kubelet/pods/186a861f-3192-4cc7-bb53-e3002f0ed873/volumes" Mar 21 05:10:59 crc 
kubenswrapper[4580]: I0321 05:10:59.638905 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b91131bd-c741-4c09-9658-df3c3dc64e84" path="/var/lib/kubelet/pods/b91131bd-c741-4c09-9658-df3c3dc64e84/volumes" Mar 21 05:11:00 crc kubenswrapper[4580]: I0321 05:11:00.058770 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"514b5967-88ad-43e2-aa38-88551fba381d","Type":"ContainerStarted","Data":"9b4da7b4588d054c87fb613aba603630fb91410092a840c18a452970fe8be49e"} Mar 21 05:11:00 crc kubenswrapper[4580]: I0321 05:11:00.060929 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tqfdg" event={"ID":"893ab010-283a-4331-834a-05586719a352","Type":"ContainerStarted","Data":"e681b0e92666b734b06e1cc41930ea4b98f836300f3b8a55cdb948aed9bd9c15"} Mar 21 05:11:02 crc kubenswrapper[4580]: I0321 05:11:02.075323 4580 generic.go:334] "Generic (PLEG): container finished" podID="2da281c0-51d3-4264-8924-83dbc85ecbf0" containerID="e716bf087ffd37acdbe79d9806c5919ea5f5dc237e0b1ec46d3ee5106548ad8e" exitCode=0 Mar 21 05:11:02 crc kubenswrapper[4580]: I0321 05:11:02.075563 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2da281c0-51d3-4264-8924-83dbc85ecbf0","Type":"ContainerDied","Data":"e716bf087ffd37acdbe79d9806c5919ea5f5dc237e0b1ec46d3ee5106548ad8e"} Mar 21 05:11:04 crc kubenswrapper[4580]: I0321 05:11:04.093967 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2da281c0-51d3-4264-8924-83dbc85ecbf0","Type":"ContainerStarted","Data":"31c6e2659e30377e60eaefc1f992371591553b1552c7d2ec5d351d6b6a389d6b"} Mar 21 05:11:04 crc kubenswrapper[4580]: I0321 05:11:04.095926 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"caa7b0b4-ac59-4338-896e-723db48b3d24","Type":"ContainerStarted","Data":"6022b48e0c0eb828b77506fcdd071acca3447e151624bfca742cdcf31c26cd9b"} Mar 21 05:11:04 crc kubenswrapper[4580]: I0321 05:11:04.097188 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 21 05:11:04 crc kubenswrapper[4580]: I0321 05:11:04.102629 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"358c9476-8608-43e6-9912-6be4fb3f2ba8","Type":"ContainerStarted","Data":"8c8b1c7c6aaccc6b5894dc93751d261fe53fd0523fd31251f60a0ef96b6e62c5"} Mar 21 05:11:04 crc kubenswrapper[4580]: I0321 05:11:04.108055 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"514b5967-88ad-43e2-aa38-88551fba381d","Type":"ContainerStarted","Data":"681a01291e1a3f554563263ec1e78539083e23d090479d610b4e5e03d11a6ff4"} Mar 21 05:11:04 crc kubenswrapper[4580]: I0321 05:11:04.137094 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=9.875233352 podStartE2EDuration="33.137069484s" podCreationTimestamp="2026-03-21 05:10:31 +0000 UTC" firstStartedPulling="2026-03-21 05:10:33.856317822 +0000 UTC m=+1138.938901450" lastFinishedPulling="2026-03-21 05:10:57.118153954 +0000 UTC m=+1162.200737582" observedRunningTime="2026-03-21 05:11:04.131699841 +0000 UTC m=+1169.214283469" watchObservedRunningTime="2026-03-21 05:11:04.137069484 +0000 UTC m=+1169.219653112" Mar 21 05:11:04 crc kubenswrapper[4580]: E0321 05:11:04.781361 4580 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.138:58404->38.102.83.138:40205: read tcp 38.102.83.138:58404->38.102.83.138:40205: read: connection reset by peer Mar 21 05:11:05 crc kubenswrapper[4580]: I0321 05:11:05.128231 4580 generic.go:334] "Generic (PLEG): container finished" podID="893ab010-283a-4331-834a-05586719a352" 
containerID="8ef1470c2f6fd4575e80eef257f09e9a27ec184c5564839d02190bcebfe4d887" exitCode=0 Mar 21 05:11:05 crc kubenswrapper[4580]: I0321 05:11:05.128533 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tqfdg" event={"ID":"893ab010-283a-4331-834a-05586719a352","Type":"ContainerDied","Data":"8ef1470c2f6fd4575e80eef257f09e9a27ec184c5564839d02190bcebfe4d887"} Mar 21 05:11:05 crc kubenswrapper[4580]: I0321 05:11:05.133868 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jjv5q" event={"ID":"15016044-062f-44bc-8278-97a43b709083","Type":"ContainerStarted","Data":"c9b24deb47064975e6ecf7a56dc3c30de1804e525f466d6b357be5b8c99e0f55"} Mar 21 05:11:05 crc kubenswrapper[4580]: I0321 05:11:05.133903 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-jjv5q" Mar 21 05:11:05 crc kubenswrapper[4580]: I0321 05:11:05.156340 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=25.337670094 podStartE2EDuration="31.156313815s" podCreationTimestamp="2026-03-21 05:10:34 +0000 UTC" firstStartedPulling="2026-03-21 05:10:57.812211615 +0000 UTC m=+1162.894795233" lastFinishedPulling="2026-03-21 05:11:03.630855326 +0000 UTC m=+1168.713438954" observedRunningTime="2026-03-21 05:11:04.172041305 +0000 UTC m=+1169.254624943" watchObservedRunningTime="2026-03-21 05:11:05.156313815 +0000 UTC m=+1170.238897443" Mar 21 05:11:05 crc kubenswrapper[4580]: I0321 05:11:05.672686 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-jjv5q" podStartSLOduration=22.771371024 podStartE2EDuration="28.672669715s" podCreationTimestamp="2026-03-21 05:10:37 +0000 UTC" firstStartedPulling="2026-03-21 05:10:57.744532013 +0000 UTC m=+1162.827115641" lastFinishedPulling="2026-03-21 05:11:03.645830704 +0000 UTC m=+1168.728414332" observedRunningTime="2026-03-21 05:11:05.175566158 
+0000 UTC m=+1170.258149806" watchObservedRunningTime="2026-03-21 05:11:05.672669715 +0000 UTC m=+1170.755253343" Mar 21 05:11:06 crc kubenswrapper[4580]: I0321 05:11:06.150042 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tqfdg" event={"ID":"893ab010-283a-4331-834a-05586719a352","Type":"ContainerStarted","Data":"292899932b975cd02b55cdc406f8a50bba4e314df5d48e9b1bcbba4cc2fe718e"} Mar 21 05:11:06 crc kubenswrapper[4580]: I0321 05:11:06.150348 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tqfdg" event={"ID":"893ab010-283a-4331-834a-05586719a352","Type":"ContainerStarted","Data":"eae140b3130762da022de1889f66a5b2b076e5451e8d8c18e5812147ce56aec9"} Mar 21 05:11:06 crc kubenswrapper[4580]: I0321 05:11:06.150921 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:11:06 crc kubenswrapper[4580]: I0321 05:11:06.150956 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:11:06 crc kubenswrapper[4580]: I0321 05:11:06.172645 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tqfdg" podStartSLOduration=25.003996415 podStartE2EDuration="29.172626798s" podCreationTimestamp="2026-03-21 05:10:37 +0000 UTC" firstStartedPulling="2026-03-21 05:10:59.47527242 +0000 UTC m=+1164.557856048" lastFinishedPulling="2026-03-21 05:11:03.643902803 +0000 UTC m=+1168.726486431" observedRunningTime="2026-03-21 05:11:06.166974287 +0000 UTC m=+1171.249557935" watchObservedRunningTime="2026-03-21 05:11:06.172626798 +0000 UTC m=+1171.255210426" Mar 21 05:11:09 crc kubenswrapper[4580]: I0321 05:11:09.177719 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"358c9476-8608-43e6-9912-6be4fb3f2ba8","Type":"ContainerStarted","Data":"f8edfa0d77d952d19b9d1f95664201167a6becb57fd01cc5ddf6d94e771567e9"} Mar 21 05:11:09 crc kubenswrapper[4580]: I0321 05:11:09.179796 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"514b5967-88ad-43e2-aa38-88551fba381d","Type":"ContainerStarted","Data":"61b9b47f36dace9c3129edab45062990271b2899025b09e52bcfc7b1daeab07d"} Mar 21 05:11:09 crc kubenswrapper[4580]: I0321 05:11:09.203438 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 21 05:11:09 crc kubenswrapper[4580]: I0321 05:11:09.209924 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=22.657999626 podStartE2EDuration="32.209904125s" podCreationTimestamp="2026-03-21 05:10:37 +0000 UTC" firstStartedPulling="2026-03-21 05:10:58.572190052 +0000 UTC m=+1163.654773680" lastFinishedPulling="2026-03-21 05:11:08.124094551 +0000 UTC m=+1173.206678179" observedRunningTime="2026-03-21 05:11:09.204311196 +0000 UTC m=+1174.286894834" watchObservedRunningTime="2026-03-21 05:11:09.209904125 +0000 UTC m=+1174.292487753" Mar 21 05:11:09 crc kubenswrapper[4580]: I0321 05:11:09.224864 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=20.270194498 podStartE2EDuration="29.224841953s" podCreationTimestamp="2026-03-21 05:10:40 +0000 UTC" firstStartedPulling="2026-03-21 05:10:59.153220424 +0000 UTC m=+1164.235804052" lastFinishedPulling="2026-03-21 05:11:08.107867879 +0000 UTC m=+1173.190451507" observedRunningTime="2026-03-21 05:11:09.21912225 +0000 UTC m=+1174.301705888" watchObservedRunningTime="2026-03-21 05:11:09.224841953 +0000 UTC m=+1174.307425591" Mar 21 05:11:09 crc kubenswrapper[4580]: I0321 05:11:09.262865 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-sb-0" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.187645 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.228989 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.482373 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-69g76"] Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.528645 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-thpf8"] Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.530085 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.534279 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.539193 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-thpf8"] Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.611517 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-vpk6m"] Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.612811 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.616410 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.640903 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vpk6m"] Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.684321 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-ovn-rundir\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.684368 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-combined-ca-bundle\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.684390 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-thpf8\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.684448 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-thpf8\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " 
pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.684475 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pnkt\" (UniqueName: \"kubernetes.io/projected/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-kube-api-access-8pnkt\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.684489 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx6pj\" (UniqueName: \"kubernetes.io/projected/023df01f-94c7-41c1-a8dc-7174b60ab172-kube-api-access-jx6pj\") pod \"dnsmasq-dns-7f896c8c65-thpf8\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.685582 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-config\") pod \"dnsmasq-dns-7f896c8c65-thpf8\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.685639 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.685663 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-config\") pod 
\"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.685685 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-ovs-rundir\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.787596 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.787951 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-config\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.787985 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-ovs-rundir\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.788035 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-ovn-rundir\") pod \"ovn-controller-metrics-vpk6m\" (UID: 
\"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.788068 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-combined-ca-bundle\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.788092 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-thpf8\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.788139 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-thpf8\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.788171 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pnkt\" (UniqueName: \"kubernetes.io/projected/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-kube-api-access-8pnkt\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.788191 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx6pj\" (UniqueName: \"kubernetes.io/projected/023df01f-94c7-41c1-a8dc-7174b60ab172-kube-api-access-jx6pj\") pod \"dnsmasq-dns-7f896c8c65-thpf8\" (UID: 
\"023df01f-94c7-41c1-a8dc-7174b60ab172\") " pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.788268 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-config\") pod \"dnsmasq-dns-7f896c8c65-thpf8\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.789223 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-config\") pod \"dnsmasq-dns-7f896c8c65-thpf8\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.789856 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-thpf8\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.790484 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-thpf8\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.791136 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-ovs-rundir\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 
05:11:10.791544 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-ovn-rundir\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.791762 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-config\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.795804 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.798329 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-combined-ca-bundle\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.810439 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pnkt\" (UniqueName: \"kubernetes.io/projected/cc8eda41-b1d2-4f48-ac6e-59b7856a0917-kube-api-access-8pnkt\") pod \"ovn-controller-metrics-vpk6m\" (UID: \"cc8eda41-b1d2-4f48-ac6e-59b7856a0917\") " pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.816583 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jx6pj\" (UniqueName: \"kubernetes.io/projected/023df01f-94c7-41c1-a8dc-7174b60ab172-kube-api-access-jx6pj\") pod \"dnsmasq-dns-7f896c8c65-thpf8\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.857226 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.896490 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-69g76" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.938439 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-vpk6m" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.994007 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtg5r\" (UniqueName: \"kubernetes.io/projected/41d2dc91-5cd7-44df-94cc-7a4f63a14193-kube-api-access-rtg5r\") pod \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\" (UID: \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\") " Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.994220 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41d2dc91-5cd7-44df-94cc-7a4f63a14193-dns-svc\") pod \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\" (UID: \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\") " Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.995074 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-df9h2"] Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.995431 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41d2dc91-5cd7-44df-94cc-7a4f63a14193-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "41d2dc91-5cd7-44df-94cc-7a4f63a14193" (UID: 
"41d2dc91-5cd7-44df-94cc-7a4f63a14193"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.995681 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41d2dc91-5cd7-44df-94cc-7a4f63a14193-config\") pod \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\" (UID: \"41d2dc91-5cd7-44df-94cc-7a4f63a14193\") " Mar 21 05:11:10 crc kubenswrapper[4580]: I0321 05:11:10.996360 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41d2dc91-5cd7-44df-94cc-7a4f63a14193-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.003497 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41d2dc91-5cd7-44df-94cc-7a4f63a14193-config" (OuterVolumeSpecName: "config") pod "41d2dc91-5cd7-44df-94cc-7a4f63a14193" (UID: "41d2dc91-5cd7-44df-94cc-7a4f63a14193"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.015654 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d2dc91-5cd7-44df-94cc-7a4f63a14193-kube-api-access-rtg5r" (OuterVolumeSpecName: "kube-api-access-rtg5r") pod "41d2dc91-5cd7-44df-94cc-7a4f63a14193" (UID: "41d2dc91-5cd7-44df-94cc-7a4f63a14193"). InnerVolumeSpecName "kube-api-access-rtg5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.022141 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-ftzww"] Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.033084 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.036482 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.109205 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsscx\" (UniqueName: \"kubernetes.io/projected/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-kube-api-access-lsscx\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.109495 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.109523 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.109549 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.109618 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-config\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.109629 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-ftzww"] Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.109681 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtg5r\" (UniqueName: \"kubernetes.io/projected/41d2dc91-5cd7-44df-94cc-7a4f63a14193-kube-api-access-rtg5r\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.109699 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41d2dc91-5cd7-44df-94cc-7a4f63a14193-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.211203 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.212498 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.212538 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.212571 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.212670 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-config\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.212770 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.212813 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsscx\" (UniqueName: \"kubernetes.io/projected/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-kube-api-access-lsscx\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.213741 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-dns-svc\") pod 
\"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.214070 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-config\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.220084 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"226921bf-412a-4dc6-a722-3fcf5ecc7fdc","Type":"ContainerStarted","Data":"d8959c7042378870cf2e94ef305b8cba71a4d3e67b558c2195fe8b49f66c4be2"} Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.220402 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.222888 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-69g76" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.223225 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-69g76" event={"ID":"41d2dc91-5cd7-44df-94cc-7a4f63a14193","Type":"ContainerDied","Data":"de6064ae6aff0c3ea45b4d7303b7730f8f8cb651d7a3366590d4bf4c00b62d6f"} Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.229215 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsscx\" (UniqueName: \"kubernetes.io/projected/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-kube-api-access-lsscx\") pod \"dnsmasq-dns-86db49b7ff-ftzww\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") " pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.247048 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.111054363 podStartE2EDuration="39.247024369s" podCreationTimestamp="2026-03-21 05:10:32 +0000 UTC" firstStartedPulling="2026-03-21 05:10:33.923154722 +0000 UTC m=+1139.005738350" lastFinishedPulling="2026-03-21 05:11:10.059124728 +0000 UTC m=+1175.141708356" observedRunningTime="2026-03-21 05:11:11.243773873 +0000 UTC m=+1176.326357511" watchObservedRunningTime="2026-03-21 05:11:11.247024369 +0000 UTC m=+1176.329607997" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.316918 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-69g76"] Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.322479 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-69g76"] Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.362245 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.585704 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vpk6m"] Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.597578 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-thpf8"] Mar 21 05:11:11 crc kubenswrapper[4580]: W0321 05:11:11.602658 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod023df01f_94c7_41c1_a8dc_7174b60ab172.slice/crio-fbde2ae5810b9817490c4cfff163af9896781138b4ca55ddaf7e2b2346415189 WatchSource:0}: Error finding container fbde2ae5810b9817490c4cfff163af9896781138b4ca55ddaf7e2b2346415189: Status 404 returned error can't find the container with id fbde2ae5810b9817490c4cfff163af9896781138b4ca55ddaf7e2b2346415189 Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.615117 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.633870 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d2dc91-5cd7-44df-94cc-7a4f63a14193" path="/var/lib/kubelet/pods/41d2dc91-5cd7-44df-94cc-7a4f63a14193/volumes" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.726536 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-dns-svc\") pod \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\" (UID: \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\") " Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.726607 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh4gs\" (UniqueName: \"kubernetes.io/projected/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-kube-api-access-sh4gs\") pod \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\" (UID: \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\") " Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.726711 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-config\") pod \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\" (UID: \"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd\") " Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.727555 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "58549e5a-e3fb-4f2e-ac39-999a8dfb3efd" (UID: "58549e5a-e3fb-4f2e-ac39-999a8dfb3efd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.727825 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-config" (OuterVolumeSpecName: "config") pod "58549e5a-e3fb-4f2e-ac39-999a8dfb3efd" (UID: "58549e5a-e3fb-4f2e-ac39-999a8dfb3efd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.730370 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-kube-api-access-sh4gs" (OuterVolumeSpecName: "kube-api-access-sh4gs") pod "58549e5a-e3fb-4f2e-ac39-999a8dfb3efd" (UID: "58549e5a-e3fb-4f2e-ac39-999a8dfb3efd"). InnerVolumeSpecName "kube-api-access-sh4gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.828608 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.828642 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh4gs\" (UniqueName: \"kubernetes.io/projected/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-kube-api-access-sh4gs\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.828653 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:11 crc kubenswrapper[4580]: I0321 05:11:11.939382 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-ftzww"] Mar 21 05:11:11 crc kubenswrapper[4580]: W0321 05:11:11.946569 4580 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf6e0a0b_20cd_4471_ada8_85566de1d2f9.slice/crio-9679be4b14716999a2c37f0b0c7cc3ceb34db23aa17a9ca73d6a42c9b5c64be2 WatchSource:0}: Error finding container 9679be4b14716999a2c37f0b0c7cc3ceb34db23aa17a9ca73d6a42c9b5c64be2: Status 404 returned error can't find the container with id 9679be4b14716999a2c37f0b0c7cc3ceb34db23aa17a9ca73d6a42c9b5c64be2 Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.167386 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.201560 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.263148 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b4f4841a-f9ee-4d9d-b756-77cabd20363a","Type":"ContainerStarted","Data":"9ca17fe2cf0b3174b65020b531387e2e6024c76b94e4ed66b3de838e32553e0c"} Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.264130 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" event={"ID":"58549e5a-e3fb-4f2e-ac39-999a8dfb3efd","Type":"ContainerDied","Data":"6bc21e09904c5a22e8b34e61cd10d0dabaea06099abd6c40a05fa4a1e99e497e"} Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.264297 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-df9h2" Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.272397 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vpk6m" event={"ID":"cc8eda41-b1d2-4f48-ac6e-59b7856a0917","Type":"ContainerStarted","Data":"cd2e2268fbc920e0da502ad6cfd439cf504728309bb5ad16e1fec580f124d30b"} Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.272447 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vpk6m" event={"ID":"cc8eda41-b1d2-4f48-ac6e-59b7856a0917","Type":"ContainerStarted","Data":"ec18c4f620dc7a08c57af544adc35ffdf91b8ba9d4354f475117558b604359bd"} Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.278644 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" event={"ID":"023df01f-94c7-41c1-a8dc-7174b60ab172","Type":"ContainerStarted","Data":"fbde2ae5810b9817490c4cfff163af9896781138b4ca55ddaf7e2b2346415189"} Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.293945 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" event={"ID":"cf6e0a0b-20cd-4471-ada8-85566de1d2f9","Type":"ContainerStarted","Data":"9679be4b14716999a2c37f0b0c7cc3ceb34db23aa17a9ca73d6a42c9b5c64be2"} Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.294504 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.340498 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-vpk6m" podStartSLOduration=2.340480456 podStartE2EDuration="2.340480456s" podCreationTimestamp="2026-03-21 05:11:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:11:12.311104654 +0000 UTC m=+1177.393688302" 
watchObservedRunningTime="2026-03-21 05:11:12.340480456 +0000 UTC m=+1177.423064084" Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.365850 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-df9h2"] Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.375580 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.396024 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-df9h2"] Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.957730 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 21 05:11:12 crc kubenswrapper[4580]: I0321 05:11:12.957772 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 21 05:11:13 crc kubenswrapper[4580]: I0321 05:11:13.626506 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58549e5a-e3fb-4f2e-ac39-999a8dfb3efd" path="/var/lib/kubelet/pods/58549e5a-e3fb-4f2e-ac39-999a8dfb3efd/volumes" Mar 21 05:11:15 crc kubenswrapper[4580]: I0321 05:11:15.350340 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:15.948082 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:15.948137 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:15.948183 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:15.948846 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ce83f011c377b22dc6fc9c4fe068d2bf2cb580d09b97baaf4fd92fe417cd5eb"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:15.948905 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://3ce83f011c377b22dc6fc9c4fe068d2bf2cb580d09b97baaf4fd92fe417cd5eb" gracePeriod=600 Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.739334 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.741117 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.759011 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-tdlzx" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.760105 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.760160 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.760179 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.770103 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.785862 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.837395 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69db8b67-aa51-41d9-8088-dba10b9bdd0d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.837460 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/69db8b67-aa51-41d9-8088-dba10b9bdd0d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.837489 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/69db8b67-aa51-41d9-8088-dba10b9bdd0d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.837534 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzr84\" (UniqueName: \"kubernetes.io/projected/69db8b67-aa51-41d9-8088-dba10b9bdd0d-kube-api-access-zzr84\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.837553 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69db8b67-aa51-41d9-8088-dba10b9bdd0d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.837571 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69db8b67-aa51-41d9-8088-dba10b9bdd0d-config\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.838638 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69db8b67-aa51-41d9-8088-dba10b9bdd0d-scripts\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.883692 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.939907 4580 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69db8b67-aa51-41d9-8088-dba10b9bdd0d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.939972 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/69db8b67-aa51-41d9-8088-dba10b9bdd0d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.940017 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69db8b67-aa51-41d9-8088-dba10b9bdd0d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.940097 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzr84\" (UniqueName: \"kubernetes.io/projected/69db8b67-aa51-41d9-8088-dba10b9bdd0d-kube-api-access-zzr84\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.940114 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69db8b67-aa51-41d9-8088-dba10b9bdd0d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.940135 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69db8b67-aa51-41d9-8088-dba10b9bdd0d-config\") pod \"ovn-northd-0\" (UID: 
\"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.940164 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69db8b67-aa51-41d9-8088-dba10b9bdd0d-scripts\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.941919 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69db8b67-aa51-41d9-8088-dba10b9bdd0d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.943059 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69db8b67-aa51-41d9-8088-dba10b9bdd0d-config\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.943816 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69db8b67-aa51-41d9-8088-dba10b9bdd0d-scripts\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.947547 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69db8b67-aa51-41d9-8088-dba10b9bdd0d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.952319 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/69db8b67-aa51-41d9-8088-dba10b9bdd0d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.958008 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/69db8b67-aa51-41d9-8088-dba10b9bdd0d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:17.961103 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:18.023012 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzr84\" (UniqueName: \"kubernetes.io/projected/69db8b67-aa51-41d9-8088-dba10b9bdd0d-kube-api-access-zzr84\") pod \"ovn-northd-0\" (UID: \"69db8b67-aa51-41d9-8088-dba10b9bdd0d\") " pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:18.060298 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:18.373061 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="3ce83f011c377b22dc6fc9c4fe068d2bf2cb580d09b97baaf4fd92fe417cd5eb" exitCode=0 Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:18.373136 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"3ce83f011c377b22dc6fc9c4fe068d2bf2cb580d09b97baaf4fd92fe417cd5eb"} Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:18.373417 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"0008875a2f7ef6e2119165dc1e0e253e98f01735aec210fb18c6ffa1eebbb281"} Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:18.373440 4580 scope.go:117] "RemoveContainer" containerID="76a03eb87bee439fb7189493fe11b7778fb36a6c538f9c47967069b07415ab8b" Mar 21 05:11:18 crc kubenswrapper[4580]: I0321 05:11:18.628299 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 21 05:11:18 crc kubenswrapper[4580]: W0321 05:11:18.644140 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69db8b67_aa51_41d9_8088_dba10b9bdd0d.slice/crio-d8e4770b99ebd59b599bf7f4c86850854d39a9123bc15f4ecdc9e07a98ab21a2 WatchSource:0}: Error finding container d8e4770b99ebd59b599bf7f4c86850854d39a9123bc15f4ecdc9e07a98ab21a2: Status 404 returned error can't find the container with id d8e4770b99ebd59b599bf7f4c86850854d39a9123bc15f4ecdc9e07a98ab21a2 Mar 21 05:11:19 crc kubenswrapper[4580]: I0321 05:11:19.388657 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"69db8b67-aa51-41d9-8088-dba10b9bdd0d","Type":"ContainerStarted","Data":"d8e4770b99ebd59b599bf7f4c86850854d39a9123bc15f4ecdc9e07a98ab21a2"} Mar 21 05:11:19 crc kubenswrapper[4580]: I0321 05:11:19.394021 4580 generic.go:334] "Generic (PLEG): container finished" podID="cf6e0a0b-20cd-4471-ada8-85566de1d2f9" containerID="497e850acbde60d4d897561c9ed640ae5d3b5d0a8a2b68d58e6f6920f7053c37" exitCode=0 Mar 21 05:11:19 crc kubenswrapper[4580]: I0321 05:11:19.394100 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" event={"ID":"cf6e0a0b-20cd-4471-ada8-85566de1d2f9","Type":"ContainerDied","Data":"497e850acbde60d4d897561c9ed640ae5d3b5d0a8a2b68d58e6f6920f7053c37"} Mar 21 05:11:19 crc kubenswrapper[4580]: I0321 05:11:19.398352 4580 generic.go:334] "Generic (PLEG): container finished" podID="023df01f-94c7-41c1-a8dc-7174b60ab172" containerID="4dc6c1668ac7086359dfa11a38926775e1bddf5e1f76286595c1bbd717155ffc" exitCode=0 Mar 21 05:11:19 crc kubenswrapper[4580]: I0321 05:11:19.398439 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" event={"ID":"023df01f-94c7-41c1-a8dc-7174b60ab172","Type":"ContainerDied","Data":"4dc6c1668ac7086359dfa11a38926775e1bddf5e1f76286595c1bbd717155ffc"} Mar 21 05:11:20 crc kubenswrapper[4580]: I0321 05:11:20.423509 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" event={"ID":"023df01f-94c7-41c1-a8dc-7174b60ab172","Type":"ContainerStarted","Data":"899a722ff7bc1906f3dbb227616b639406eac93ae85a74f618945cbd720784eb"} Mar 21 05:11:20 crc kubenswrapper[4580]: I0321 05:11:20.424419 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:20 crc kubenswrapper[4580]: I0321 05:11:20.430675 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" 
event={"ID":"cf6e0a0b-20cd-4471-ada8-85566de1d2f9","Type":"ContainerStarted","Data":"46a0fdd4a9fdfcfd14c76379e351c2b3c98de0425473d9deabd125aa9ebef93a"} Mar 21 05:11:20 crc kubenswrapper[4580]: I0321 05:11:20.430945 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:20 crc kubenswrapper[4580]: I0321 05:11:20.451480 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" podStartSLOduration=4.084623049 podStartE2EDuration="10.451458957s" podCreationTimestamp="2026-03-21 05:11:10 +0000 UTC" firstStartedPulling="2026-03-21 05:11:11.608202707 +0000 UTC m=+1176.690786335" lastFinishedPulling="2026-03-21 05:11:17.975038614 +0000 UTC m=+1183.057622243" observedRunningTime="2026-03-21 05:11:20.444971314 +0000 UTC m=+1185.527554962" watchObservedRunningTime="2026-03-21 05:11:20.451458957 +0000 UTC m=+1185.534042585" Mar 21 05:11:20 crc kubenswrapper[4580]: I0321 05:11:20.465484 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" podStartSLOduration=4.415638014 podStartE2EDuration="10.46546863s" podCreationTimestamp="2026-03-21 05:11:10 +0000 UTC" firstStartedPulling="2026-03-21 05:11:11.949639639 +0000 UTC m=+1177.032223257" lastFinishedPulling="2026-03-21 05:11:17.999470245 +0000 UTC m=+1183.082053873" observedRunningTime="2026-03-21 05:11:20.464911055 +0000 UTC m=+1185.547494693" watchObservedRunningTime="2026-03-21 05:11:20.46546863 +0000 UTC m=+1185.548052258" Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.260010 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-n7bq7"] Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.261501 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-n7bq7" Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.264175 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.279812 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-n7bq7"] Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.329635 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da794dd-3fdf-4146-a540-a13835a84b24-operator-scripts\") pod \"root-account-create-update-n7bq7\" (UID: \"6da794dd-3fdf-4146-a540-a13835a84b24\") " pod="openstack/root-account-create-update-n7bq7" Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.329681 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4hx4\" (UniqueName: \"kubernetes.io/projected/6da794dd-3fdf-4146-a540-a13835a84b24-kube-api-access-r4hx4\") pod \"root-account-create-update-n7bq7\" (UID: \"6da794dd-3fdf-4146-a540-a13835a84b24\") " pod="openstack/root-account-create-update-n7bq7" Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.433364 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da794dd-3fdf-4146-a540-a13835a84b24-operator-scripts\") pod \"root-account-create-update-n7bq7\" (UID: \"6da794dd-3fdf-4146-a540-a13835a84b24\") " pod="openstack/root-account-create-update-n7bq7" Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.433400 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4hx4\" (UniqueName: \"kubernetes.io/projected/6da794dd-3fdf-4146-a540-a13835a84b24-kube-api-access-r4hx4\") pod \"root-account-create-update-n7bq7\" (UID: 
\"6da794dd-3fdf-4146-a540-a13835a84b24\") " pod="openstack/root-account-create-update-n7bq7" Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.434639 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da794dd-3fdf-4146-a540-a13835a84b24-operator-scripts\") pod \"root-account-create-update-n7bq7\" (UID: \"6da794dd-3fdf-4146-a540-a13835a84b24\") " pod="openstack/root-account-create-update-n7bq7" Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.441185 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"69db8b67-aa51-41d9-8088-dba10b9bdd0d","Type":"ContainerStarted","Data":"ba14d6272a310110d58c4bd7454a49d82cb3e9df47dc75c6cb9aec35d1ab4173"} Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.441226 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"69db8b67-aa51-41d9-8088-dba10b9bdd0d","Type":"ContainerStarted","Data":"f44a9bbb97ab8f23358789c5dafc6be0c36f72aa2956a09bb4818477c44bae7e"} Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.453423 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4hx4\" (UniqueName: \"kubernetes.io/projected/6da794dd-3fdf-4146-a540-a13835a84b24-kube-api-access-r4hx4\") pod \"root-account-create-update-n7bq7\" (UID: \"6da794dd-3fdf-4146-a540-a13835a84b24\") " pod="openstack/root-account-create-update-n7bq7" Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.467465 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.974498986 podStartE2EDuration="4.467442301s" podCreationTimestamp="2026-03-21 05:11:17 +0000 UTC" firstStartedPulling="2026-03-21 05:11:18.646567395 +0000 UTC m=+1183.729151023" lastFinishedPulling="2026-03-21 05:11:20.13951071 +0000 UTC m=+1185.222094338" observedRunningTime="2026-03-21 05:11:21.457498956 +0000 UTC 
m=+1186.540082594" watchObservedRunningTime="2026-03-21 05:11:21.467442301 +0000 UTC m=+1186.550025929" Mar 21 05:11:21 crc kubenswrapper[4580]: I0321 05:11:21.583330 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-n7bq7" Mar 21 05:11:22 crc kubenswrapper[4580]: I0321 05:11:22.120891 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-n7bq7"] Mar 21 05:11:22 crc kubenswrapper[4580]: I0321 05:11:22.457031 4580 generic.go:334] "Generic (PLEG): container finished" podID="b4f4841a-f9ee-4d9d-b756-77cabd20363a" containerID="9ca17fe2cf0b3174b65020b531387e2e6024c76b94e4ed66b3de838e32553e0c" exitCode=0 Mar 21 05:11:22 crc kubenswrapper[4580]: I0321 05:11:22.457148 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b4f4841a-f9ee-4d9d-b756-77cabd20363a","Type":"ContainerDied","Data":"9ca17fe2cf0b3174b65020b531387e2e6024c76b94e4ed66b3de838e32553e0c"} Mar 21 05:11:22 crc kubenswrapper[4580]: I0321 05:11:22.465515 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n7bq7" event={"ID":"6da794dd-3fdf-4146-a540-a13835a84b24","Type":"ContainerStarted","Data":"00e2791c81bdea6deaa93b6a032b985c2babac1e542835fab1777a5a7677ca19"} Mar 21 05:11:22 crc kubenswrapper[4580]: I0321 05:11:22.465578 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n7bq7" event={"ID":"6da794dd-3fdf-4146-a540-a13835a84b24","Type":"ContainerStarted","Data":"6ebff5b875e88b103fa0439babbeb38d7a4c8ca6969aebd4f4648f82f08abec9"} Mar 21 05:11:22 crc kubenswrapper[4580]: I0321 05:11:22.465621 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 21 05:11:22 crc kubenswrapper[4580]: I0321 05:11:22.521275 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/root-account-create-update-n7bq7" podStartSLOduration=1.5212533910000001 podStartE2EDuration="1.521253391s" podCreationTimestamp="2026-03-21 05:11:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:11:22.517133671 +0000 UTC m=+1187.599717299" watchObservedRunningTime="2026-03-21 05:11:22.521253391 +0000 UTC m=+1187.603837019" Mar 21 05:11:23 crc kubenswrapper[4580]: I0321 05:11:23.474833 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b4f4841a-f9ee-4d9d-b756-77cabd20363a","Type":"ContainerStarted","Data":"804b64580c8f26ca2dcea90065d06ad57122b67c05710952c58a0f93f38dd2d8"} Mar 21 05:11:23 crc kubenswrapper[4580]: I0321 05:11:23.476582 4580 generic.go:334] "Generic (PLEG): container finished" podID="6da794dd-3fdf-4146-a540-a13835a84b24" containerID="00e2791c81bdea6deaa93b6a032b985c2babac1e542835fab1777a5a7677ca19" exitCode=0 Mar 21 05:11:23 crc kubenswrapper[4580]: I0321 05:11:23.476627 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n7bq7" event={"ID":"6da794dd-3fdf-4146-a540-a13835a84b24","Type":"ContainerDied","Data":"00e2791c81bdea6deaa93b6a032b985c2babac1e542835fab1777a5a7677ca19"} Mar 21 05:11:23 crc kubenswrapper[4580]: I0321 05:11:23.502941 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371982.351858 podStartE2EDuration="54.502917901s" podCreationTimestamp="2026-03-21 05:10:29 +0000 UTC" firstStartedPulling="2026-03-21 05:10:31.968122492 +0000 UTC m=+1137.050706120" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:11:23.496419358 +0000 UTC m=+1188.579003006" watchObservedRunningTime="2026-03-21 05:11:23.502917901 +0000 UTC m=+1188.585501529" Mar 21 05:11:24 crc kubenswrapper[4580]: I0321 05:11:24.820913 4580 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-n7bq7" Mar 21 05:11:24 crc kubenswrapper[4580]: I0321 05:11:24.996998 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4hx4\" (UniqueName: \"kubernetes.io/projected/6da794dd-3fdf-4146-a540-a13835a84b24-kube-api-access-r4hx4\") pod \"6da794dd-3fdf-4146-a540-a13835a84b24\" (UID: \"6da794dd-3fdf-4146-a540-a13835a84b24\") " Mar 21 05:11:24 crc kubenswrapper[4580]: I0321 05:11:24.997213 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da794dd-3fdf-4146-a540-a13835a84b24-operator-scripts\") pod \"6da794dd-3fdf-4146-a540-a13835a84b24\" (UID: \"6da794dd-3fdf-4146-a540-a13835a84b24\") " Mar 21 05:11:25 crc kubenswrapper[4580]: I0321 05:11:25.001827 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da794dd-3fdf-4146-a540-a13835a84b24-kube-api-access-r4hx4" (OuterVolumeSpecName: "kube-api-access-r4hx4") pod "6da794dd-3fdf-4146-a540-a13835a84b24" (UID: "6da794dd-3fdf-4146-a540-a13835a84b24"). InnerVolumeSpecName "kube-api-access-r4hx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:25 crc kubenswrapper[4580]: I0321 05:11:25.009172 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6da794dd-3fdf-4146-a540-a13835a84b24-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6da794dd-3fdf-4146-a540-a13835a84b24" (UID: "6da794dd-3fdf-4146-a540-a13835a84b24"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:25 crc kubenswrapper[4580]: I0321 05:11:25.099348 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da794dd-3fdf-4146-a540-a13835a84b24-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:25 crc kubenswrapper[4580]: I0321 05:11:25.099391 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4hx4\" (UniqueName: \"kubernetes.io/projected/6da794dd-3fdf-4146-a540-a13835a84b24-kube-api-access-r4hx4\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:25 crc kubenswrapper[4580]: I0321 05:11:25.489907 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n7bq7" event={"ID":"6da794dd-3fdf-4146-a540-a13835a84b24","Type":"ContainerDied","Data":"6ebff5b875e88b103fa0439babbeb38d7a4c8ca6969aebd4f4648f82f08abec9"} Mar 21 05:11:25 crc kubenswrapper[4580]: I0321 05:11:25.489952 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-n7bq7" Mar 21 05:11:25 crc kubenswrapper[4580]: I0321 05:11:25.489964 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ebff5b875e88b103fa0439babbeb38d7a4c8ca6969aebd4f4648f82f08abec9" Mar 21 05:11:25 crc kubenswrapper[4580]: I0321 05:11:25.859957 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:26 crc kubenswrapper[4580]: I0321 05:11:26.364659 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" Mar 21 05:11:26 crc kubenswrapper[4580]: I0321 05:11:26.414622 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-thpf8"] Mar 21 05:11:26 crc kubenswrapper[4580]: I0321 05:11:26.495751 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" podUID="023df01f-94c7-41c1-a8dc-7174b60ab172" containerName="dnsmasq-dns" containerID="cri-o://899a722ff7bc1906f3dbb227616b639406eac93ae85a74f618945cbd720784eb" gracePeriod=10 Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.431924 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.509387 4580 generic.go:334] "Generic (PLEG): container finished" podID="023df01f-94c7-41c1-a8dc-7174b60ab172" containerID="899a722ff7bc1906f3dbb227616b639406eac93ae85a74f618945cbd720784eb" exitCode=0 Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.509461 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.509457 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" event={"ID":"023df01f-94c7-41c1-a8dc-7174b60ab172","Type":"ContainerDied","Data":"899a722ff7bc1906f3dbb227616b639406eac93ae85a74f618945cbd720784eb"} Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.509900 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-thpf8" event={"ID":"023df01f-94c7-41c1-a8dc-7174b60ab172","Type":"ContainerDied","Data":"fbde2ae5810b9817490c4cfff163af9896781138b4ca55ddaf7e2b2346415189"} Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.509925 4580 scope.go:117] "RemoveContainer" containerID="899a722ff7bc1906f3dbb227616b639406eac93ae85a74f618945cbd720784eb" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.538461 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-config\") pod \"023df01f-94c7-41c1-a8dc-7174b60ab172\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.538523 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx6pj\" (UniqueName: \"kubernetes.io/projected/023df01f-94c7-41c1-a8dc-7174b60ab172-kube-api-access-jx6pj\") pod \"023df01f-94c7-41c1-a8dc-7174b60ab172\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.538581 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-dns-svc\") pod \"023df01f-94c7-41c1-a8dc-7174b60ab172\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.538682 4580 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-ovsdbserver-sb\") pod \"023df01f-94c7-41c1-a8dc-7174b60ab172\" (UID: \"023df01f-94c7-41c1-a8dc-7174b60ab172\") " Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.540742 4580 scope.go:117] "RemoveContainer" containerID="4dc6c1668ac7086359dfa11a38926775e1bddf5e1f76286595c1bbd717155ffc" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.558009 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/023df01f-94c7-41c1-a8dc-7174b60ab172-kube-api-access-jx6pj" (OuterVolumeSpecName: "kube-api-access-jx6pj") pod "023df01f-94c7-41c1-a8dc-7174b60ab172" (UID: "023df01f-94c7-41c1-a8dc-7174b60ab172"). InnerVolumeSpecName "kube-api-access-jx6pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.563989 4580 scope.go:117] "RemoveContainer" containerID="899a722ff7bc1906f3dbb227616b639406eac93ae85a74f618945cbd720784eb" Mar 21 05:11:27 crc kubenswrapper[4580]: E0321 05:11:27.564445 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"899a722ff7bc1906f3dbb227616b639406eac93ae85a74f618945cbd720784eb\": container with ID starting with 899a722ff7bc1906f3dbb227616b639406eac93ae85a74f618945cbd720784eb not found: ID does not exist" containerID="899a722ff7bc1906f3dbb227616b639406eac93ae85a74f618945cbd720784eb" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.564573 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"899a722ff7bc1906f3dbb227616b639406eac93ae85a74f618945cbd720784eb"} err="failed to get container status \"899a722ff7bc1906f3dbb227616b639406eac93ae85a74f618945cbd720784eb\": rpc error: code = NotFound desc = could not find container 
\"899a722ff7bc1906f3dbb227616b639406eac93ae85a74f618945cbd720784eb\": container with ID starting with 899a722ff7bc1906f3dbb227616b639406eac93ae85a74f618945cbd720784eb not found: ID does not exist" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.564671 4580 scope.go:117] "RemoveContainer" containerID="4dc6c1668ac7086359dfa11a38926775e1bddf5e1f76286595c1bbd717155ffc" Mar 21 05:11:27 crc kubenswrapper[4580]: E0321 05:11:27.565167 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc6c1668ac7086359dfa11a38926775e1bddf5e1f76286595c1bbd717155ffc\": container with ID starting with 4dc6c1668ac7086359dfa11a38926775e1bddf5e1f76286595c1bbd717155ffc not found: ID does not exist" containerID="4dc6c1668ac7086359dfa11a38926775e1bddf5e1f76286595c1bbd717155ffc" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.565256 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc6c1668ac7086359dfa11a38926775e1bddf5e1f76286595c1bbd717155ffc"} err="failed to get container status \"4dc6c1668ac7086359dfa11a38926775e1bddf5e1f76286595c1bbd717155ffc\": rpc error: code = NotFound desc = could not find container \"4dc6c1668ac7086359dfa11a38926775e1bddf5e1f76286595c1bbd717155ffc\": container with ID starting with 4dc6c1668ac7086359dfa11a38926775e1bddf5e1f76286595c1bbd717155ffc not found: ID does not exist" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.586475 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "023df01f-94c7-41c1-a8dc-7174b60ab172" (UID: "023df01f-94c7-41c1-a8dc-7174b60ab172"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.597583 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "023df01f-94c7-41c1-a8dc-7174b60ab172" (UID: "023df01f-94c7-41c1-a8dc-7174b60ab172"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.611379 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-config" (OuterVolumeSpecName: "config") pod "023df01f-94c7-41c1-a8dc-7174b60ab172" (UID: "023df01f-94c7-41c1-a8dc-7174b60ab172"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.640255 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.640291 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.640304 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx6pj\" (UniqueName: \"kubernetes.io/projected/023df01f-94c7-41c1-a8dc-7174b60ab172-kube-api-access-jx6pj\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.640315 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/023df01f-94c7-41c1-a8dc-7174b60ab172-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 
05:11:27.849268 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-thpf8"] Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.860156 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-thpf8"] Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.910565 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-mtsm2"] Mar 21 05:11:27 crc kubenswrapper[4580]: E0321 05:11:27.910998 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da794dd-3fdf-4146-a540-a13835a84b24" containerName="mariadb-account-create-update" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.911021 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da794dd-3fdf-4146-a540-a13835a84b24" containerName="mariadb-account-create-update" Mar 21 05:11:27 crc kubenswrapper[4580]: E0321 05:11:27.911058 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="023df01f-94c7-41c1-a8dc-7174b60ab172" containerName="dnsmasq-dns" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.911067 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="023df01f-94c7-41c1-a8dc-7174b60ab172" containerName="dnsmasq-dns" Mar 21 05:11:27 crc kubenswrapper[4580]: E0321 05:11:27.911126 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="023df01f-94c7-41c1-a8dc-7174b60ab172" containerName="init" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.911136 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="023df01f-94c7-41c1-a8dc-7174b60ab172" containerName="init" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.911362 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="023df01f-94c7-41c1-a8dc-7174b60ab172" containerName="dnsmasq-dns" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.911387 4580 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6da794dd-3fdf-4146-a540-a13835a84b24" containerName="mariadb-account-create-update" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.912389 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:27 crc kubenswrapper[4580]: I0321 05:11:27.933970 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mtsm2"] Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.064678 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78qs5\" (UniqueName: \"kubernetes.io/projected/1bcfc888-3413-4eb9-a887-ef250a15962a-kube-api-access-78qs5\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.064743 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.064806 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-dns-svc\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.064875 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-config\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: 
\"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.064926 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.166016 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78qs5\" (UniqueName: \"kubernetes.io/projected/1bcfc888-3413-4eb9-a887-ef250a15962a-kube-api-access-78qs5\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.166074 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.166096 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-dns-svc\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.166145 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-config\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " 
pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.166183 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.167261 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.167274 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.167318 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-dns-svc\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.167591 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-config\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.196915 4580 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78qs5\" (UniqueName: \"kubernetes.io/projected/1bcfc888-3413-4eb9-a887-ef250a15962a-kube-api-access-78qs5\") pod \"dnsmasq-dns-698758b865-mtsm2\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.230927 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:28 crc kubenswrapper[4580]: I0321 05:11:28.721294 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mtsm2"] Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.079296 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.098895 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.103487 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-h56f6" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.103503 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.103559 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.103725 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.130424 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.284836 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/d59ab798-9ae9-4f47-b58b-36417592eef2-lock\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.285336 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d59ab798-9ae9-4f47-b58b-36417592eef2-cache\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.285464 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.285696 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.285843 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59ab798-9ae9-4f47-b58b-36417592eef2-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.285964 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjqj6\" (UniqueName: \"kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-kube-api-access-sjqj6\") pod \"swift-storage-0\" (UID: 
\"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.388420 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.388677 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59ab798-9ae9-4f47-b58b-36417592eef2-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.388761 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjqj6\" (UniqueName: \"kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-kube-api-access-sjqj6\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.389458 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d59ab798-9ae9-4f47-b58b-36417592eef2-lock\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.389629 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d59ab798-9ae9-4f47-b58b-36417592eef2-cache\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.389787 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.390069 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d59ab798-9ae9-4f47-b58b-36417592eef2-lock\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: E0321 05:11:29.390085 4580 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.390122 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d59ab798-9ae9-4f47-b58b-36417592eef2-cache\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: E0321 05:11:29.390128 4580 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 05:11:29 crc kubenswrapper[4580]: E0321 05:11:29.390207 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift podName:d59ab798-9ae9-4f47-b58b-36417592eef2 nodeName:}" failed. No retries permitted until 2026-03-21 05:11:29.890188587 +0000 UTC m=+1194.972772215 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift") pod "swift-storage-0" (UID: "d59ab798-9ae9-4f47-b58b-36417592eef2") : configmap "swift-ring-files" not found Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.389010 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.397649 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59ab798-9ae9-4f47-b58b-36417592eef2-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.416834 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.418083 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjqj6\" (UniqueName: \"kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-kube-api-access-sjqj6\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.526684 4580 generic.go:334] "Generic (PLEG): container finished" podID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerID="21567b14a9c45003f7fb9cd7948fd6e686a89c6c8ad543f92d66fa175f5d0053" exitCode=0 Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 
05:11:29.526743 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mtsm2" event={"ID":"1bcfc888-3413-4eb9-a887-ef250a15962a","Type":"ContainerDied","Data":"21567b14a9c45003f7fb9cd7948fd6e686a89c6c8ad543f92d66fa175f5d0053"} Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.526808 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mtsm2" event={"ID":"1bcfc888-3413-4eb9-a887-ef250a15962a","Type":"ContainerStarted","Data":"6ef9b81fd82b8fb915899af4c63d13838d3d530eb4270aeb7294cdb86622d777"} Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.629926 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="023df01f-94c7-41c1-a8dc-7174b60ab172" path="/var/lib/kubelet/pods/023df01f-94c7-41c1-a8dc-7174b60ab172/volumes" Mar 21 05:11:29 crc kubenswrapper[4580]: I0321 05:11:29.897871 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:29 crc kubenswrapper[4580]: E0321 05:11:29.898009 4580 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 05:11:29 crc kubenswrapper[4580]: E0321 05:11:29.898036 4580 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 05:11:29 crc kubenswrapper[4580]: E0321 05:11:29.898105 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift podName:d59ab798-9ae9-4f47-b58b-36417592eef2 nodeName:}" failed. No retries permitted until 2026-03-21 05:11:30.898087062 +0000 UTC m=+1195.980670690 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift") pod "swift-storage-0" (UID: "d59ab798-9ae9-4f47-b58b-36417592eef2") : configmap "swift-ring-files" not found Mar 21 05:11:30 crc kubenswrapper[4580]: I0321 05:11:30.534064 4580 generic.go:334] "Generic (PLEG): container finished" podID="38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" containerID="4d38eef5a720f735d000140426f0b792c62476df0bed0a49cfb7c06e936f571f" exitCode=0 Mar 21 05:11:30 crc kubenswrapper[4580]: I0321 05:11:30.534121 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf","Type":"ContainerDied","Data":"4d38eef5a720f735d000140426f0b792c62476df0bed0a49cfb7c06e936f571f"} Mar 21 05:11:30 crc kubenswrapper[4580]: I0321 05:11:30.537598 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mtsm2" event={"ID":"1bcfc888-3413-4eb9-a887-ef250a15962a","Type":"ContainerStarted","Data":"d34a243adbe1462703a007a8dac8774f1d988e5dc334028b7c8492fa49467735"} Mar 21 05:11:30 crc kubenswrapper[4580]: I0321 05:11:30.538152 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:30 crc kubenswrapper[4580]: I0321 05:11:30.539598 4580 generic.go:334] "Generic (PLEG): container finished" podID="ac0ed353-d343-4f14-804b-affb2f0cc4d6" containerID="517c3f090ecfb64bc2b909bb5db8ba938766cb21ee2ca89f76f4bc37007b4577" exitCode=0 Mar 21 05:11:30 crc kubenswrapper[4580]: I0321 05:11:30.539626 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ac0ed353-d343-4f14-804b-affb2f0cc4d6","Type":"ContainerDied","Data":"517c3f090ecfb64bc2b909bb5db8ba938766cb21ee2ca89f76f4bc37007b4577"} Mar 21 05:11:30 crc kubenswrapper[4580]: I0321 05:11:30.599394 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-698758b865-mtsm2" podStartSLOduration=3.599372556 podStartE2EDuration="3.599372556s" podCreationTimestamp="2026-03-21 05:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:11:30.584729006 +0000 UTC m=+1195.667312694" watchObservedRunningTime="2026-03-21 05:11:30.599372556 +0000 UTC m=+1195.681956184" Mar 21 05:11:30 crc kubenswrapper[4580]: I0321 05:11:30.913343 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:30 crc kubenswrapper[4580]: E0321 05:11:30.913495 4580 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 05:11:30 crc kubenswrapper[4580]: E0321 05:11:30.913825 4580 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 05:11:30 crc kubenswrapper[4580]: E0321 05:11:30.913885 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift podName:d59ab798-9ae9-4f47-b58b-36417592eef2 nodeName:}" failed. No retries permitted until 2026-03-21 05:11:32.91386699 +0000 UTC m=+1197.996450618 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift") pod "swift-storage-0" (UID: "d59ab798-9ae9-4f47-b58b-36417592eef2") : configmap "swift-ring-files" not found Mar 21 05:11:31 crc kubenswrapper[4580]: I0321 05:11:31.143310 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 21 05:11:31 crc kubenswrapper[4580]: I0321 05:11:31.144887 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 21 05:11:31 crc kubenswrapper[4580]: I0321 05:11:31.245323 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 21 05:11:31 crc kubenswrapper[4580]: I0321 05:11:31.562926 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf","Type":"ContainerStarted","Data":"87e95c71e5b4c69e0f416f42c75f627726c658bcb7ae591e1214f6008e48d864"} Mar 21 05:11:31 crc kubenswrapper[4580]: I0321 05:11:31.563256 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 21 05:11:31 crc kubenswrapper[4580]: I0321 05:11:31.567895 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ac0ed353-d343-4f14-804b-affb2f0cc4d6","Type":"ContainerStarted","Data":"86a71490e6445977417c6007cecc2a18237aebb0844bdc2aa10212685e5f69b1"} Mar 21 05:11:31 crc kubenswrapper[4580]: I0321 05:11:31.591158 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.797216869 podStartE2EDuration="1m3.591134595s" podCreationTimestamp="2026-03-21 05:10:28 +0000 UTC" firstStartedPulling="2026-03-21 05:10:31.252814626 +0000 UTC m=+1136.335398254" lastFinishedPulling="2026-03-21 05:10:57.046732342 +0000 UTC m=+1162.129315980" 
observedRunningTime="2026-03-21 05:11:31.585547856 +0000 UTC m=+1196.668131504" watchObservedRunningTime="2026-03-21 05:11:31.591134595 +0000 UTC m=+1196.673718223" Mar 21 05:11:31 crc kubenswrapper[4580]: I0321 05:11:31.614112 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.92713596 podStartE2EDuration="1m3.614080716s" podCreationTimestamp="2026-03-21 05:10:28 +0000 UTC" firstStartedPulling="2026-03-21 05:10:30.551528772 +0000 UTC m=+1135.634112400" lastFinishedPulling="2026-03-21 05:10:57.238473528 +0000 UTC m=+1162.321057156" observedRunningTime="2026-03-21 05:11:31.612492314 +0000 UTC m=+1196.695075952" watchObservedRunningTime="2026-03-21 05:11:31.614080716 +0000 UTC m=+1196.696664354" Mar 21 05:11:31 crc kubenswrapper[4580]: I0321 05:11:31.692624 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 21 05:11:32 crc kubenswrapper[4580]: I0321 05:11:32.955025 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:32 crc kubenswrapper[4580]: E0321 05:11:32.955250 4580 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 05:11:32 crc kubenswrapper[4580]: E0321 05:11:32.955264 4580 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 05:11:32 crc kubenswrapper[4580]: E0321 05:11:32.955304 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift podName:d59ab798-9ae9-4f47-b58b-36417592eef2 nodeName:}" failed. 
No retries permitted until 2026-03-21 05:11:36.95529062 +0000 UTC m=+1202.037874248 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift") pod "swift-storage-0" (UID: "d59ab798-9ae9-4f47-b58b-36417592eef2") : configmap "swift-ring-files" not found Mar 21 05:11:32 crc kubenswrapper[4580]: I0321 05:11:32.979612 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-kkh4r"] Mar 21 05:11:32 crc kubenswrapper[4580]: I0321 05:11:32.980589 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:32 crc kubenswrapper[4580]: I0321 05:11:32.985552 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 21 05:11:32 crc kubenswrapper[4580]: I0321 05:11:32.985795 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 21 05:11:32 crc kubenswrapper[4580]: I0321 05:11:32.985997 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 21 05:11:32 crc kubenswrapper[4580]: I0321 05:11:32.993563 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kkh4r"] Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.056966 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d23c194-d398-4264-8726-c75316c85eff-ring-data-devices\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.057012 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-swiftconf\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.057033 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d23c194-d398-4264-8726-c75316c85eff-scripts\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.057058 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d23c194-d398-4264-8726-c75316c85eff-etc-swift\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.057165 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-dispersionconf\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.057200 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxpjh\" (UniqueName: \"kubernetes.io/projected/3d23c194-d398-4264-8726-c75316c85eff-kube-api-access-cxpjh\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.057327 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-combined-ca-bundle\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.158701 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-combined-ca-bundle\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.158763 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d23c194-d398-4264-8726-c75316c85eff-ring-data-devices\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.158804 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-swiftconf\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.158831 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d23c194-d398-4264-8726-c75316c85eff-scripts\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.158897 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d23c194-d398-4264-8726-c75316c85eff-etc-swift\") pod 
\"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.159055 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-dispersionconf\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.159520 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d23c194-d398-4264-8726-c75316c85eff-etc-swift\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.159587 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxpjh\" (UniqueName: \"kubernetes.io/projected/3d23c194-d398-4264-8726-c75316c85eff-kube-api-access-cxpjh\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.159683 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d23c194-d398-4264-8726-c75316c85eff-scripts\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.159926 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d23c194-d398-4264-8726-c75316c85eff-ring-data-devices\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " 
pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.164285 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-swiftconf\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.164691 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-combined-ca-bundle\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.169599 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-dispersionconf\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.239394 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxpjh\" (UniqueName: \"kubernetes.io/projected/3d23c194-d398-4264-8726-c75316c85eff-kube-api-access-cxpjh\") pod \"swift-ring-rebalance-kkh4r\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.297233 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:33 crc kubenswrapper[4580]: W0321 05:11:33.856937 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d23c194_d398_4264_8726_c75316c85eff.slice/crio-ebcebd15d2a6a05311db51a79f4975085eb6ba00ff8619c9ae1dcf69896f7594 WatchSource:0}: Error finding container ebcebd15d2a6a05311db51a79f4975085eb6ba00ff8619c9ae1dcf69896f7594: Status 404 returned error can't find the container with id ebcebd15d2a6a05311db51a79f4975085eb6ba00ff8619c9ae1dcf69896f7594 Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.862635 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kkh4r"] Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.994032 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-64qzs"] Mar 21 05:11:33 crc kubenswrapper[4580]: I0321 05:11:33.995073 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-64qzs" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.011464 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-64qzs"] Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.073468 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcmfq\" (UniqueName: \"kubernetes.io/projected/f34fd0f6-3c75-42ff-909d-32c6255e5c68-kube-api-access-rcmfq\") pod \"keystone-db-create-64qzs\" (UID: \"f34fd0f6-3c75-42ff-909d-32c6255e5c68\") " pod="openstack/keystone-db-create-64qzs" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.073613 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34fd0f6-3c75-42ff-909d-32c6255e5c68-operator-scripts\") pod \"keystone-db-create-64qzs\" (UID: \"f34fd0f6-3c75-42ff-909d-32c6255e5c68\") " pod="openstack/keystone-db-create-64qzs" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.102519 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7442-account-create-update-fqvpj"] Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.103834 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7442-account-create-update-fqvpj" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.105975 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.122830 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7442-account-create-update-fqvpj"] Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.182061 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hjn2\" (UniqueName: \"kubernetes.io/projected/ca4f77fa-08f5-4c3c-9300-112440f9acc1-kube-api-access-7hjn2\") pod \"keystone-7442-account-create-update-fqvpj\" (UID: \"ca4f77fa-08f5-4c3c-9300-112440f9acc1\") " pod="openstack/keystone-7442-account-create-update-fqvpj" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.182182 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34fd0f6-3c75-42ff-909d-32c6255e5c68-operator-scripts\") pod \"keystone-db-create-64qzs\" (UID: \"f34fd0f6-3c75-42ff-909d-32c6255e5c68\") " pod="openstack/keystone-db-create-64qzs" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.182238 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca4f77fa-08f5-4c3c-9300-112440f9acc1-operator-scripts\") pod \"keystone-7442-account-create-update-fqvpj\" (UID: \"ca4f77fa-08f5-4c3c-9300-112440f9acc1\") " pod="openstack/keystone-7442-account-create-update-fqvpj" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.182283 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcmfq\" (UniqueName: \"kubernetes.io/projected/f34fd0f6-3c75-42ff-909d-32c6255e5c68-kube-api-access-rcmfq\") pod \"keystone-db-create-64qzs\" 
(UID: \"f34fd0f6-3c75-42ff-909d-32c6255e5c68\") " pod="openstack/keystone-db-create-64qzs" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.183087 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34fd0f6-3c75-42ff-909d-32c6255e5c68-operator-scripts\") pod \"keystone-db-create-64qzs\" (UID: \"f34fd0f6-3c75-42ff-909d-32c6255e5c68\") " pod="openstack/keystone-db-create-64qzs" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.212537 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcmfq\" (UniqueName: \"kubernetes.io/projected/f34fd0f6-3c75-42ff-909d-32c6255e5c68-kube-api-access-rcmfq\") pod \"keystone-db-create-64qzs\" (UID: \"f34fd0f6-3c75-42ff-909d-32c6255e5c68\") " pod="openstack/keystone-db-create-64qzs" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.227710 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6hzdk"] Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.229413 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6hzdk" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.244167 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6hzdk"] Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.289546 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6afa77b-c41a-42ad-a253-64cf7e6e5544-operator-scripts\") pod \"placement-db-create-6hzdk\" (UID: \"b6afa77b-c41a-42ad-a253-64cf7e6e5544\") " pod="openstack/placement-db-create-6hzdk" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.289651 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hjn2\" (UniqueName: \"kubernetes.io/projected/ca4f77fa-08f5-4c3c-9300-112440f9acc1-kube-api-access-7hjn2\") pod \"keystone-7442-account-create-update-fqvpj\" (UID: \"ca4f77fa-08f5-4c3c-9300-112440f9acc1\") " pod="openstack/keystone-7442-account-create-update-fqvpj" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.289730 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76g86\" (UniqueName: \"kubernetes.io/projected/b6afa77b-c41a-42ad-a253-64cf7e6e5544-kube-api-access-76g86\") pod \"placement-db-create-6hzdk\" (UID: \"b6afa77b-c41a-42ad-a253-64cf7e6e5544\") " pod="openstack/placement-db-create-6hzdk" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.289774 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca4f77fa-08f5-4c3c-9300-112440f9acc1-operator-scripts\") pod \"keystone-7442-account-create-update-fqvpj\" (UID: \"ca4f77fa-08f5-4c3c-9300-112440f9acc1\") " pod="openstack/keystone-7442-account-create-update-fqvpj" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.290465 4580 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca4f77fa-08f5-4c3c-9300-112440f9acc1-operator-scripts\") pod \"keystone-7442-account-create-update-fqvpj\" (UID: \"ca4f77fa-08f5-4c3c-9300-112440f9acc1\") " pod="openstack/keystone-7442-account-create-update-fqvpj" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.314595 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-64qzs" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.363938 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hjn2\" (UniqueName: \"kubernetes.io/projected/ca4f77fa-08f5-4c3c-9300-112440f9acc1-kube-api-access-7hjn2\") pod \"keystone-7442-account-create-update-fqvpj\" (UID: \"ca4f77fa-08f5-4c3c-9300-112440f9acc1\") " pod="openstack/keystone-7442-account-create-update-fqvpj" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.390859 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-81c4-account-create-update-khzvr"] Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.391921 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-81c4-account-create-update-khzvr" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.393183 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76g86\" (UniqueName: \"kubernetes.io/projected/b6afa77b-c41a-42ad-a253-64cf7e6e5544-kube-api-access-76g86\") pod \"placement-db-create-6hzdk\" (UID: \"b6afa77b-c41a-42ad-a253-64cf7e6e5544\") " pod="openstack/placement-db-create-6hzdk" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.393248 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6afa77b-c41a-42ad-a253-64cf7e6e5544-operator-scripts\") pod \"placement-db-create-6hzdk\" (UID: \"b6afa77b-c41a-42ad-a253-64cf7e6e5544\") " pod="openstack/placement-db-create-6hzdk" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.394007 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6afa77b-c41a-42ad-a253-64cf7e6e5544-operator-scripts\") pod \"placement-db-create-6hzdk\" (UID: \"b6afa77b-c41a-42ad-a253-64cf7e6e5544\") " pod="openstack/placement-db-create-6hzdk" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.397981 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.408623 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-81c4-account-create-update-khzvr"] Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.417317 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76g86\" (UniqueName: \"kubernetes.io/projected/b6afa77b-c41a-42ad-a253-64cf7e6e5544-kube-api-access-76g86\") pod \"placement-db-create-6hzdk\" (UID: \"b6afa77b-c41a-42ad-a253-64cf7e6e5544\") " pod="openstack/placement-db-create-6hzdk" Mar 21 05:11:34 crc 
kubenswrapper[4580]: I0321 05:11:34.418635 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7442-account-create-update-fqvpj" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.495183 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf6c0d9f-d17b-458f-a653-713f834d70cc-operator-scripts\") pod \"placement-81c4-account-create-update-khzvr\" (UID: \"bf6c0d9f-d17b-458f-a653-713f834d70cc\") " pod="openstack/placement-81c4-account-create-update-khzvr" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.495334 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blw6k\" (UniqueName: \"kubernetes.io/projected/bf6c0d9f-d17b-458f-a653-713f834d70cc-kube-api-access-blw6k\") pod \"placement-81c4-account-create-update-khzvr\" (UID: \"bf6c0d9f-d17b-458f-a653-713f834d70cc\") " pod="openstack/placement-81c4-account-create-update-khzvr" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.582854 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6hzdk" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.594524 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kkh4r" event={"ID":"3d23c194-d398-4264-8726-c75316c85eff","Type":"ContainerStarted","Data":"ebcebd15d2a6a05311db51a79f4975085eb6ba00ff8619c9ae1dcf69896f7594"} Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.597826 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blw6k\" (UniqueName: \"kubernetes.io/projected/bf6c0d9f-d17b-458f-a653-713f834d70cc-kube-api-access-blw6k\") pod \"placement-81c4-account-create-update-khzvr\" (UID: \"bf6c0d9f-d17b-458f-a653-713f834d70cc\") " pod="openstack/placement-81c4-account-create-update-khzvr" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.597898 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf6c0d9f-d17b-458f-a653-713f834d70cc-operator-scripts\") pod \"placement-81c4-account-create-update-khzvr\" (UID: \"bf6c0d9f-d17b-458f-a653-713f834d70cc\") " pod="openstack/placement-81c4-account-create-update-khzvr" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.598614 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf6c0d9f-d17b-458f-a653-713f834d70cc-operator-scripts\") pod \"placement-81c4-account-create-update-khzvr\" (UID: \"bf6c0d9f-d17b-458f-a653-713f834d70cc\") " pod="openstack/placement-81c4-account-create-update-khzvr" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.651541 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blw6k\" (UniqueName: \"kubernetes.io/projected/bf6c0d9f-d17b-458f-a653-713f834d70cc-kube-api-access-blw6k\") pod \"placement-81c4-account-create-update-khzvr\" (UID: 
\"bf6c0d9f-d17b-458f-a653-713f834d70cc\") " pod="openstack/placement-81c4-account-create-update-khzvr" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.791258 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-81c4-account-create-update-khzvr" Mar 21 05:11:34 crc kubenswrapper[4580]: I0321 05:11:34.861020 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-64qzs"] Mar 21 05:11:35 crc kubenswrapper[4580]: I0321 05:11:35.143565 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7442-account-create-update-fqvpj"] Mar 21 05:11:35 crc kubenswrapper[4580]: W0321 05:11:35.210022 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca4f77fa_08f5_4c3c_9300_112440f9acc1.slice/crio-5a9601f5e6ddfb5a476f0149419d47f3c5ed704b9cb4538122d1eea3defd32a9 WatchSource:0}: Error finding container 5a9601f5e6ddfb5a476f0149419d47f3c5ed704b9cb4538122d1eea3defd32a9: Status 404 returned error can't find the container with id 5a9601f5e6ddfb5a476f0149419d47f3c5ed704b9cb4538122d1eea3defd32a9 Mar 21 05:11:35 crc kubenswrapper[4580]: I0321 05:11:35.252247 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6hzdk"] Mar 21 05:11:35 crc kubenswrapper[4580]: W0321 05:11:35.287593 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6afa77b_c41a_42ad_a253_64cf7e6e5544.slice/crio-0fcae70a1faf0f2f5d728a9a759f15acdda99e7698c833d5e00916419d9c699f WatchSource:0}: Error finding container 0fcae70a1faf0f2f5d728a9a759f15acdda99e7698c833d5e00916419d9c699f: Status 404 returned error can't find the container with id 0fcae70a1faf0f2f5d728a9a759f15acdda99e7698c833d5e00916419d9c699f Mar 21 05:11:35 crc kubenswrapper[4580]: I0321 05:11:35.667868 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-64qzs" event={"ID":"f34fd0f6-3c75-42ff-909d-32c6255e5c68","Type":"ContainerStarted","Data":"88709bbb3c283cf655abc2c25069420c06e5ed4452325b4f16aa2da2adc0772a"} Mar 21 05:11:35 crc kubenswrapper[4580]: I0321 05:11:35.668255 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-64qzs" event={"ID":"f34fd0f6-3c75-42ff-909d-32c6255e5c68","Type":"ContainerStarted","Data":"505f002dc9e2aafa3d514bf202eeba2bfa56722b73f335fcbddb11ff884bd786"} Mar 21 05:11:35 crc kubenswrapper[4580]: I0321 05:11:35.668269 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7442-account-create-update-fqvpj" event={"ID":"ca4f77fa-08f5-4c3c-9300-112440f9acc1","Type":"ContainerStarted","Data":"5a9601f5e6ddfb5a476f0149419d47f3c5ed704b9cb4538122d1eea3defd32a9"} Mar 21 05:11:35 crc kubenswrapper[4580]: I0321 05:11:35.668284 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6hzdk" event={"ID":"b6afa77b-c41a-42ad-a253-64cf7e6e5544","Type":"ContainerStarted","Data":"0fcae70a1faf0f2f5d728a9a759f15acdda99e7698c833d5e00916419d9c699f"} Mar 21 05:11:35 crc kubenswrapper[4580]: I0321 05:11:35.787853 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7442-account-create-update-fqvpj" podStartSLOduration=1.787833856 podStartE2EDuration="1.787833856s" podCreationTimestamp="2026-03-21 05:11:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:11:35.783175882 +0000 UTC m=+1200.865759510" watchObservedRunningTime="2026-03-21 05:11:35.787833856 +0000 UTC m=+1200.870417484" Mar 21 05:11:35 crc kubenswrapper[4580]: I0321 05:11:35.809005 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-6hzdk" podStartSLOduration=1.8089915589999999 podStartE2EDuration="1.808991559s" 
podCreationTimestamp="2026-03-21 05:11:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:11:35.804475159 +0000 UTC m=+1200.887058787" watchObservedRunningTime="2026-03-21 05:11:35.808991559 +0000 UTC m=+1200.891575187" Mar 21 05:11:35 crc kubenswrapper[4580]: I0321 05:11:35.834912 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-81c4-account-create-update-khzvr"] Mar 21 05:11:35 crc kubenswrapper[4580]: I0321 05:11:35.835416 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-64qzs" podStartSLOduration=2.835404232 podStartE2EDuration="2.835404232s" podCreationTimestamp="2026-03-21 05:11:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:11:35.825861928 +0000 UTC m=+1200.908445556" watchObservedRunningTime="2026-03-21 05:11:35.835404232 +0000 UTC m=+1200.917987860" Mar 21 05:11:35 crc kubenswrapper[4580]: I0321 05:11:35.866614 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 21 05:11:36 crc kubenswrapper[4580]: I0321 05:11:36.670248 4580 generic.go:334] "Generic (PLEG): container finished" podID="f34fd0f6-3c75-42ff-909d-32c6255e5c68" containerID="88709bbb3c283cf655abc2c25069420c06e5ed4452325b4f16aa2da2adc0772a" exitCode=0 Mar 21 05:11:36 crc kubenswrapper[4580]: I0321 05:11:36.670295 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-64qzs" event={"ID":"f34fd0f6-3c75-42ff-909d-32c6255e5c68","Type":"ContainerDied","Data":"88709bbb3c283cf655abc2c25069420c06e5ed4452325b4f16aa2da2adc0772a"} Mar 21 05:11:36 crc kubenswrapper[4580]: I0321 05:11:36.676738 4580 generic.go:334] "Generic (PLEG): container finished" podID="bf6c0d9f-d17b-458f-a653-713f834d70cc" 
containerID="35e740c8a272a667638641c3119b6bc3f021d8e408a408c6e63666b815dd286f" exitCode=0 Mar 21 05:11:36 crc kubenswrapper[4580]: I0321 05:11:36.676842 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-81c4-account-create-update-khzvr" event={"ID":"bf6c0d9f-d17b-458f-a653-713f834d70cc","Type":"ContainerDied","Data":"35e740c8a272a667638641c3119b6bc3f021d8e408a408c6e63666b815dd286f"} Mar 21 05:11:36 crc kubenswrapper[4580]: I0321 05:11:36.676876 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-81c4-account-create-update-khzvr" event={"ID":"bf6c0d9f-d17b-458f-a653-713f834d70cc","Type":"ContainerStarted","Data":"5ddf8a2b6396f10df91f4c8c3ed74e39b0cf9937ece1415cdc52d3e8993b9998"} Mar 21 05:11:36 crc kubenswrapper[4580]: I0321 05:11:36.679008 4580 generic.go:334] "Generic (PLEG): container finished" podID="ca4f77fa-08f5-4c3c-9300-112440f9acc1" containerID="4abb427e8ef3ed91ba0d42cfab260c30f4ef95eb6459fe530829f6d00e9b02fe" exitCode=0 Mar 21 05:11:36 crc kubenswrapper[4580]: I0321 05:11:36.679059 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7442-account-create-update-fqvpj" event={"ID":"ca4f77fa-08f5-4c3c-9300-112440f9acc1","Type":"ContainerDied","Data":"4abb427e8ef3ed91ba0d42cfab260c30f4ef95eb6459fe530829f6d00e9b02fe"} Mar 21 05:11:36 crc kubenswrapper[4580]: I0321 05:11:36.681226 4580 generic.go:334] "Generic (PLEG): container finished" podID="b6afa77b-c41a-42ad-a253-64cf7e6e5544" containerID="5b4ddf91f9e1f80a62c63f53c4c525fda532989ab0b4bc8ca18d83e29b62ab2b" exitCode=0 Mar 21 05:11:36 crc kubenswrapper[4580]: I0321 05:11:36.681264 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6hzdk" event={"ID":"b6afa77b-c41a-42ad-a253-64cf7e6e5544","Type":"ContainerDied","Data":"5b4ddf91f9e1f80a62c63f53c4c525fda532989ab0b4bc8ca18d83e29b62ab2b"} Mar 21 05:11:36 crc kubenswrapper[4580]: I0321 05:11:36.978278 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:36 crc kubenswrapper[4580]: E0321 05:11:36.978439 4580 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 05:11:36 crc kubenswrapper[4580]: E0321 05:11:36.978471 4580 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 05:11:36 crc kubenswrapper[4580]: E0321 05:11:36.978553 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift podName:d59ab798-9ae9-4f47-b58b-36417592eef2 nodeName:}" failed. No retries permitted until 2026-03-21 05:11:44.978534771 +0000 UTC m=+1210.061118399 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift") pod "swift-storage-0" (UID: "d59ab798-9ae9-4f47-b58b-36417592eef2") : configmap "swift-ring-files" not found Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.147902 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.211227 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-mwp8r"] Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.212570 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-mwp8r" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.233430 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.250847 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mwp8r"] Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.307742 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0d1dcc7-e806-4348-94ab-347efc9930b7-operator-scripts\") pod \"glance-db-create-mwp8r\" (UID: \"d0d1dcc7-e806-4348-94ab-347efc9930b7\") " pod="openstack/glance-db-create-mwp8r" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.307887 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgsp6\" (UniqueName: \"kubernetes.io/projected/d0d1dcc7-e806-4348-94ab-347efc9930b7-kube-api-access-lgsp6\") pod \"glance-db-create-mwp8r\" (UID: \"d0d1dcc7-e806-4348-94ab-347efc9930b7\") " pod="openstack/glance-db-create-mwp8r" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.354774 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-ftzww"] Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.355146 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" podUID="cf6e0a0b-20cd-4471-ada8-85566de1d2f9" containerName="dnsmasq-dns" containerID="cri-o://46a0fdd4a9fdfcfd14c76379e351c2b3c98de0425473d9deabd125aa9ebef93a" gracePeriod=10 Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.410315 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0d1dcc7-e806-4348-94ab-347efc9930b7-operator-scripts\") 
pod \"glance-db-create-mwp8r\" (UID: \"d0d1dcc7-e806-4348-94ab-347efc9930b7\") " pod="openstack/glance-db-create-mwp8r" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.410453 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgsp6\" (UniqueName: \"kubernetes.io/projected/d0d1dcc7-e806-4348-94ab-347efc9930b7-kube-api-access-lgsp6\") pod \"glance-db-create-mwp8r\" (UID: \"d0d1dcc7-e806-4348-94ab-347efc9930b7\") " pod="openstack/glance-db-create-mwp8r" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.411889 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0d1dcc7-e806-4348-94ab-347efc9930b7-operator-scripts\") pod \"glance-db-create-mwp8r\" (UID: \"d0d1dcc7-e806-4348-94ab-347efc9930b7\") " pod="openstack/glance-db-create-mwp8r" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.455163 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8ced-account-create-update-rvjp7"] Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.456210 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8ced-account-create-update-rvjp7" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.460609 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.474508 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgsp6\" (UniqueName: \"kubernetes.io/projected/d0d1dcc7-e806-4348-94ab-347efc9930b7-kube-api-access-lgsp6\") pod \"glance-db-create-mwp8r\" (UID: \"d0d1dcc7-e806-4348-94ab-347efc9930b7\") " pod="openstack/glance-db-create-mwp8r" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.490384 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8ced-account-create-update-rvjp7"] Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.543127 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mwp8r" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.561254 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.603068 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tqfdg" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.603135 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-jjv5q" podUID="15016044-062f-44bc-8278-97a43b709083" containerName="ovn-controller" probeResult="failure" output=< Mar 21 05:11:38 crc kubenswrapper[4580]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 21 05:11:38 crc kubenswrapper[4580]: > Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.620190 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f2rk\" (UniqueName: 
\"kubernetes.io/projected/6da615d4-b81c-4ad0-90b2-23e4029c949c-kube-api-access-2f2rk\") pod \"glance-8ced-account-create-update-rvjp7\" (UID: \"6da615d4-b81c-4ad0-90b2-23e4029c949c\") " pod="openstack/glance-8ced-account-create-update-rvjp7" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.620246 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da615d4-b81c-4ad0-90b2-23e4029c949c-operator-scripts\") pod \"glance-8ced-account-create-update-rvjp7\" (UID: \"6da615d4-b81c-4ad0-90b2-23e4029c949c\") " pod="openstack/glance-8ced-account-create-update-rvjp7" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.723539 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f2rk\" (UniqueName: \"kubernetes.io/projected/6da615d4-b81c-4ad0-90b2-23e4029c949c-kube-api-access-2f2rk\") pod \"glance-8ced-account-create-update-rvjp7\" (UID: \"6da615d4-b81c-4ad0-90b2-23e4029c949c\") " pod="openstack/glance-8ced-account-create-update-rvjp7" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.723601 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da615d4-b81c-4ad0-90b2-23e4029c949c-operator-scripts\") pod \"glance-8ced-account-create-update-rvjp7\" (UID: \"6da615d4-b81c-4ad0-90b2-23e4029c949c\") " pod="openstack/glance-8ced-account-create-update-rvjp7" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.725130 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da615d4-b81c-4ad0-90b2-23e4029c949c-operator-scripts\") pod \"glance-8ced-account-create-update-rvjp7\" (UID: \"6da615d4-b81c-4ad0-90b2-23e4029c949c\") " pod="openstack/glance-8ced-account-create-update-rvjp7" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.754196 4580 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f2rk\" (UniqueName: \"kubernetes.io/projected/6da615d4-b81c-4ad0-90b2-23e4029c949c-kube-api-access-2f2rk\") pod \"glance-8ced-account-create-update-rvjp7\" (UID: \"6da615d4-b81c-4ad0-90b2-23e4029c949c\") " pod="openstack/glance-8ced-account-create-update-rvjp7" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.832107 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8ced-account-create-update-rvjp7" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.980344 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jjv5q-config-2fhqf"] Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.981443 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:38 crc kubenswrapper[4580]: I0321 05:11:38.984542 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.051710 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jjv5q-config-2fhqf"] Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.133704 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x76b7\" (UniqueName: \"kubernetes.io/projected/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-kube-api-access-x76b7\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.133819 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-log-ovn\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: 
\"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.133853 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-run\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.133890 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-scripts\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.133918 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-run-ovn\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.133934 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-additional-scripts\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.235964 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x76b7\" (UniqueName: 
\"kubernetes.io/projected/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-kube-api-access-x76b7\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.236045 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-log-ovn\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.236078 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-run\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.236112 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-scripts\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.236151 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-run-ovn\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.236172 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-additional-scripts\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.236541 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-run\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.236606 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-log-ovn\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.236935 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-additional-scripts\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.236997 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-run-ovn\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.240703 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-scripts\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.286505 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x76b7\" (UniqueName: \"kubernetes.io/projected/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-kube-api-access-x76b7\") pod \"ovn-controller-jjv5q-config-2fhqf\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") " pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.301366 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jjv5q-config-2fhqf" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.705850 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-n7bq7"] Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.712712 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-n7bq7"] Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.730127 4580 generic.go:334] "Generic (PLEG): container finished" podID="cf6e0a0b-20cd-4471-ada8-85566de1d2f9" containerID="46a0fdd4a9fdfcfd14c76379e351c2b3c98de0425473d9deabd125aa9ebef93a" exitCode=0 Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.730168 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" event={"ID":"cf6e0a0b-20cd-4471-ada8-85566de1d2f9","Type":"ContainerDied","Data":"46a0fdd4a9fdfcfd14c76379e351c2b3c98de0425473d9deabd125aa9ebef93a"} Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.801853 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.810202 4580 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/root-account-create-update-hnpbd"] Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.812554 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hnpbd" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.815730 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.819399 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hnpbd"] Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.947166 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fbe609f-6d91-4f8b-82b9-17a602597351-operator-scripts\") pod \"root-account-create-update-hnpbd\" (UID: \"5fbe609f-6d91-4f8b-82b9-17a602597351\") " pod="openstack/root-account-create-update-hnpbd" Mar 21 05:11:39 crc kubenswrapper[4580]: I0321 05:11:39.947222 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr9pc\" (UniqueName: \"kubernetes.io/projected/5fbe609f-6d91-4f8b-82b9-17a602597351-kube-api-access-kr9pc\") pod \"root-account-create-update-hnpbd\" (UID: \"5fbe609f-6d91-4f8b-82b9-17a602597351\") " pod="openstack/root-account-create-update-hnpbd" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.049240 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fbe609f-6d91-4f8b-82b9-17a602597351-operator-scripts\") pod \"root-account-create-update-hnpbd\" (UID: \"5fbe609f-6d91-4f8b-82b9-17a602597351\") " pod="openstack/root-account-create-update-hnpbd" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.049294 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kr9pc\" (UniqueName: \"kubernetes.io/projected/5fbe609f-6d91-4f8b-82b9-17a602597351-kube-api-access-kr9pc\") pod \"root-account-create-update-hnpbd\" (UID: \"5fbe609f-6d91-4f8b-82b9-17a602597351\") " pod="openstack/root-account-create-update-hnpbd" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.049928 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fbe609f-6d91-4f8b-82b9-17a602597351-operator-scripts\") pod \"root-account-create-update-hnpbd\" (UID: \"5fbe609f-6d91-4f8b-82b9-17a602597351\") " pod="openstack/root-account-create-update-hnpbd" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.080160 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr9pc\" (UniqueName: \"kubernetes.io/projected/5fbe609f-6d91-4f8b-82b9-17a602597351-kube-api-access-kr9pc\") pod \"root-account-create-update-hnpbd\" (UID: \"5fbe609f-6d91-4f8b-82b9-17a602597351\") " pod="openstack/root-account-create-update-hnpbd" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.136352 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hnpbd" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.396105 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.445660 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7442-account-create-update-fqvpj" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.462175 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-64qzs" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.561549 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6hzdk" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.562926 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hjn2\" (UniqueName: \"kubernetes.io/projected/ca4f77fa-08f5-4c3c-9300-112440f9acc1-kube-api-access-7hjn2\") pod \"ca4f77fa-08f5-4c3c-9300-112440f9acc1\" (UID: \"ca4f77fa-08f5-4c3c-9300-112440f9acc1\") " Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.563172 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34fd0f6-3c75-42ff-909d-32c6255e5c68-operator-scripts\") pod \"f34fd0f6-3c75-42ff-909d-32c6255e5c68\" (UID: \"f34fd0f6-3c75-42ff-909d-32c6255e5c68\") " Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.563301 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcmfq\" (UniqueName: \"kubernetes.io/projected/f34fd0f6-3c75-42ff-909d-32c6255e5c68-kube-api-access-rcmfq\") pod \"f34fd0f6-3c75-42ff-909d-32c6255e5c68\" (UID: \"f34fd0f6-3c75-42ff-909d-32c6255e5c68\") " Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.563414 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca4f77fa-08f5-4c3c-9300-112440f9acc1-operator-scripts\") pod \"ca4f77fa-08f5-4c3c-9300-112440f9acc1\" (UID: \"ca4f77fa-08f5-4c3c-9300-112440f9acc1\") " Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.563495 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-81c4-account-create-update-khzvr" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.564532 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f34fd0f6-3c75-42ff-909d-32c6255e5c68-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f34fd0f6-3c75-42ff-909d-32c6255e5c68" (UID: "f34fd0f6-3c75-42ff-909d-32c6255e5c68"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.565366 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca4f77fa-08f5-4c3c-9300-112440f9acc1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca4f77fa-08f5-4c3c-9300-112440f9acc1" (UID: "ca4f77fa-08f5-4c3c-9300-112440f9acc1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.571168 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34fd0f6-3c75-42ff-909d-32c6255e5c68-kube-api-access-rcmfq" (OuterVolumeSpecName: "kube-api-access-rcmfq") pod "f34fd0f6-3c75-42ff-909d-32c6255e5c68" (UID: "f34fd0f6-3c75-42ff-909d-32c6255e5c68"). InnerVolumeSpecName "kube-api-access-rcmfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.573143 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4f77fa-08f5-4c3c-9300-112440f9acc1-kube-api-access-7hjn2" (OuterVolumeSpecName: "kube-api-access-7hjn2") pod "ca4f77fa-08f5-4c3c-9300-112440f9acc1" (UID: "ca4f77fa-08f5-4c3c-9300-112440f9acc1"). InnerVolumeSpecName "kube-api-access-7hjn2". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.665279 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf6c0d9f-d17b-458f-a653-713f834d70cc-operator-scripts\") pod \"bf6c0d9f-d17b-458f-a653-713f834d70cc\" (UID: \"bf6c0d9f-d17b-458f-a653-713f834d70cc\") "
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.665413 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76g86\" (UniqueName: \"kubernetes.io/projected/b6afa77b-c41a-42ad-a253-64cf7e6e5544-kube-api-access-76g86\") pod \"b6afa77b-c41a-42ad-a253-64cf7e6e5544\" (UID: \"b6afa77b-c41a-42ad-a253-64cf7e6e5544\") "
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.665480 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6afa77b-c41a-42ad-a253-64cf7e6e5544-operator-scripts\") pod \"b6afa77b-c41a-42ad-a253-64cf7e6e5544\" (UID: \"b6afa77b-c41a-42ad-a253-64cf7e6e5544\") "
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.665590 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blw6k\" (UniqueName: \"kubernetes.io/projected/bf6c0d9f-d17b-458f-a653-713f834d70cc-kube-api-access-blw6k\") pod \"bf6c0d9f-d17b-458f-a653-713f834d70cc\" (UID: \"bf6c0d9f-d17b-458f-a653-713f834d70cc\") "
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.666709 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hjn2\" (UniqueName: \"kubernetes.io/projected/ca4f77fa-08f5-4c3c-9300-112440f9acc1-kube-api-access-7hjn2\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.666736 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f34fd0f6-3c75-42ff-909d-32c6255e5c68-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.666751 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcmfq\" (UniqueName: \"kubernetes.io/projected/f34fd0f6-3c75-42ff-909d-32c6255e5c68-kube-api-access-rcmfq\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.666763 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca4f77fa-08f5-4c3c-9300-112440f9acc1-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.671298 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf6c0d9f-d17b-458f-a653-713f834d70cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf6c0d9f-d17b-458f-a653-713f834d70cc" (UID: "bf6c0d9f-d17b-458f-a653-713f834d70cc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.671746 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6afa77b-c41a-42ad-a253-64cf7e6e5544-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6afa77b-c41a-42ad-a253-64cf7e6e5544" (UID: "b6afa77b-c41a-42ad-a253-64cf7e6e5544"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.683289 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6afa77b-c41a-42ad-a253-64cf7e6e5544-kube-api-access-76g86" (OuterVolumeSpecName: "kube-api-access-76g86") pod "b6afa77b-c41a-42ad-a253-64cf7e6e5544" (UID: "b6afa77b-c41a-42ad-a253-64cf7e6e5544"). InnerVolumeSpecName "kube-api-access-76g86". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.692217 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf6c0d9f-d17b-458f-a653-713f834d70cc-kube-api-access-blw6k" (OuterVolumeSpecName: "kube-api-access-blw6k") pod "bf6c0d9f-d17b-458f-a653-713f834d70cc" (UID: "bf6c0d9f-d17b-458f-a653-713f834d70cc"). InnerVolumeSpecName "kube-api-access-blw6k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.750008 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-81c4-account-create-update-khzvr" event={"ID":"bf6c0d9f-d17b-458f-a653-713f834d70cc","Type":"ContainerDied","Data":"5ddf8a2b6396f10df91f4c8c3ed74e39b0cf9937ece1415cdc52d3e8993b9998"}
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.750334 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ddf8a2b6396f10df91f4c8c3ed74e39b0cf9937ece1415cdc52d3e8993b9998"
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.750434 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-81c4-account-create-update-khzvr"
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.764516 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7442-account-create-update-fqvpj"
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.764518 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7442-account-create-update-fqvpj" event={"ID":"ca4f77fa-08f5-4c3c-9300-112440f9acc1","Type":"ContainerDied","Data":"5a9601f5e6ddfb5a476f0149419d47f3c5ed704b9cb4538122d1eea3defd32a9"}
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.764755 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a9601f5e6ddfb5a476f0149419d47f3c5ed704b9cb4538122d1eea3defd32a9"
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.767971 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blw6k\" (UniqueName: \"kubernetes.io/projected/bf6c0d9f-d17b-458f-a653-713f834d70cc-kube-api-access-blw6k\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.768006 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf6c0d9f-d17b-458f-a653-713f834d70cc-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.768019 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76g86\" (UniqueName: \"kubernetes.io/projected/b6afa77b-c41a-42ad-a253-64cf7e6e5544-kube-api-access-76g86\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.768030 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6afa77b-c41a-42ad-a253-64cf7e6e5544-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.769183 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6hzdk" event={"ID":"b6afa77b-c41a-42ad-a253-64cf7e6e5544","Type":"ContainerDied","Data":"0fcae70a1faf0f2f5d728a9a759f15acdda99e7698c833d5e00916419d9c699f"}
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.769216 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fcae70a1faf0f2f5d728a9a759f15acdda99e7698c833d5e00916419d9c699f"
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.769280 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6hzdk"
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.771805 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-64qzs" event={"ID":"f34fd0f6-3c75-42ff-909d-32c6255e5c68","Type":"ContainerDied","Data":"505f002dc9e2aafa3d514bf202eeba2bfa56722b73f335fcbddb11ff884bd786"}
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.771827 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="505f002dc9e2aafa3d514bf202eeba2bfa56722b73f335fcbddb11ff884bd786"
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.771857 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-64qzs"
Mar 21 05:11:40 crc kubenswrapper[4580]: I0321 05:11:40.973198 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-ftzww"
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.072634 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsscx\" (UniqueName: \"kubernetes.io/projected/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-kube-api-access-lsscx\") pod \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") "
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.072924 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-ovsdbserver-nb\") pod \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") "
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.072972 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-ovsdbserver-sb\") pod \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") "
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.073101 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-config\") pod \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") "
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.073166 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-dns-svc\") pod \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\" (UID: \"cf6e0a0b-20cd-4471-ada8-85566de1d2f9\") "
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.110848 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-kube-api-access-lsscx" (OuterVolumeSpecName: "kube-api-access-lsscx") pod "cf6e0a0b-20cd-4471-ada8-85566de1d2f9" (UID: "cf6e0a0b-20cd-4471-ada8-85566de1d2f9"). InnerVolumeSpecName "kube-api-access-lsscx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.185606 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf6e0a0b-20cd-4471-ada8-85566de1d2f9" (UID: "cf6e0a0b-20cd-4471-ada8-85566de1d2f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.204765 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.204810 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsscx\" (UniqueName: \"kubernetes.io/projected/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-kube-api-access-lsscx\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.253924 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-config" (OuterVolumeSpecName: "config") pod "cf6e0a0b-20cd-4471-ada8-85566de1d2f9" (UID: "cf6e0a0b-20cd-4471-ada8-85566de1d2f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.256632 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mwp8r"]
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.267940 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf6e0a0b-20cd-4471-ada8-85566de1d2f9" (UID: "cf6e0a0b-20cd-4471-ada8-85566de1d2f9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.270858 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf6e0a0b-20cd-4471-ada8-85566de1d2f9" (UID: "cf6e0a0b-20cd-4471-ada8-85566de1d2f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.330659 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-config\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.330689 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.330699 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf6e0a0b-20cd-4471-ada8-85566de1d2f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:41 crc kubenswrapper[4580]: E0321 05:11:41.381021 4580 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6afa77b_c41a_42ad_a253_64cf7e6e5544.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf6c0d9f_d17b_458f_a653_713f834d70cc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca4f77fa_08f5_4c3c_9300_112440f9acc1.slice/crio-5a9601f5e6ddfb5a476f0149419d47f3c5ed704b9cb4538122d1eea3defd32a9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf34fd0f6_3c75_42ff_909d_32c6255e5c68.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf34fd0f6_3c75_42ff_909d_32c6255e5c68.slice/crio-505f002dc9e2aafa3d514bf202eeba2bfa56722b73f335fcbddb11ff884bd786\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca4f77fa_08f5_4c3c_9300_112440f9acc1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf6c0d9f_d17b_458f_a653_713f834d70cc.slice/crio-5ddf8a2b6396f10df91f4c8c3ed74e39b0cf9937ece1415cdc52d3e8993b9998\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6afa77b_c41a_42ad_a253_64cf7e6e5544.slice/crio-0fcae70a1faf0f2f5d728a9a759f15acdda99e7698c833d5e00916419d9c699f\": RecentStats: unable to find data in memory cache]"
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.491752 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jjv5q-config-2fhqf"]
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.587921 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8ced-account-create-update-rvjp7"]
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.636306 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da794dd-3fdf-4146-a540-a13835a84b24" path="/var/lib/kubelet/pods/6da794dd-3fdf-4146-a540-a13835a84b24/volumes"
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.636958 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hnpbd"]
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.793581 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-ftzww" event={"ID":"cf6e0a0b-20cd-4471-ada8-85566de1d2f9","Type":"ContainerDied","Data":"9679be4b14716999a2c37f0b0c7cc3ceb34db23aa17a9ca73d6a42c9b5c64be2"}
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.793636 4580 scope.go:117] "RemoveContainer" containerID="46a0fdd4a9fdfcfd14c76379e351c2b3c98de0425473d9deabd125aa9ebef93a"
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.793667 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-ftzww"
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.802694 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kkh4r" event={"ID":"3d23c194-d398-4264-8726-c75316c85eff","Type":"ContainerStarted","Data":"4fbc26acabfdc474a897bf00ef5f1b4eeafff4620604a7c5200e6e6d41fa5afb"}
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.810097 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mwp8r" event={"ID":"d0d1dcc7-e806-4348-94ab-347efc9930b7","Type":"ContainerStarted","Data":"5216f1062d76a636ca1b786578bf61c096a09052c1ea40ecb1f3b558aad85422"}
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.810143 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mwp8r" event={"ID":"d0d1dcc7-e806-4348-94ab-347efc9930b7","Type":"ContainerStarted","Data":"02dbd68cd2bcc621e225e331ea6d33c61e95162c13b173e4b37d58f144805342"}
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.818647 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jjv5q-config-2fhqf" event={"ID":"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30","Type":"ContainerStarted","Data":"debe77220e3034f9542eb7dc1ec39e2ba7bf699687fefe1b21bfc9c5858b77ca"}
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.819849 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hnpbd" event={"ID":"5fbe609f-6d91-4f8b-82b9-17a602597351","Type":"ContainerStarted","Data":"544e9a7bfc535ad3ca07f641aa5abb9ca66dc183925d62191d0cf125a22c6b61"}
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.826612 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8ced-account-create-update-rvjp7" event={"ID":"6da615d4-b81c-4ad0-90b2-23e4029c949c","Type":"ContainerStarted","Data":"abe3b2e3bc8b27b3725f3ee8093e7c3be454930c40b745b415bb76a998eee056"}
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.831280 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-ftzww"]
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.851153 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-ftzww"]
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.859358 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-kkh4r" podStartSLOduration=3.16122412 podStartE2EDuration="9.859340748s" podCreationTimestamp="2026-03-21 05:11:32 +0000 UTC" firstStartedPulling="2026-03-21 05:11:33.859105727 +0000 UTC m=+1198.941689355" lastFinishedPulling="2026-03-21 05:11:40.557222355 +0000 UTC m=+1205.639805983" observedRunningTime="2026-03-21 05:11:41.853718568 +0000 UTC m=+1206.936302196" watchObservedRunningTime="2026-03-21 05:11:41.859340748 +0000 UTC m=+1206.941924376"
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.860404 4580 scope.go:117] "RemoveContainer" containerID="497e850acbde60d4d897561c9ed640ae5d3b5d0a8a2b68d58e6f6920f7053c37"
Mar 21 05:11:41 crc kubenswrapper[4580]: I0321 05:11:41.891697 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-mwp8r" podStartSLOduration=3.891678099 podStartE2EDuration="3.891678099s" podCreationTimestamp="2026-03-21 05:11:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:11:41.879747712 +0000 UTC m=+1206.962331340" watchObservedRunningTime="2026-03-21 05:11:41.891678099 +0000 UTC m=+1206.974261727"
Mar 21 05:11:42 crc kubenswrapper[4580]: I0321 05:11:42.835478 4580 generic.go:334] "Generic (PLEG): container finished" podID="6da615d4-b81c-4ad0-90b2-23e4029c949c" containerID="75aed0c59374edf1569b77da32cdb9b7e67e7e26d83e53c2564c247d2d8910a6" exitCode=0
Mar 21 05:11:42 crc kubenswrapper[4580]: I0321 05:11:42.835603 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8ced-account-create-update-rvjp7" event={"ID":"6da615d4-b81c-4ad0-90b2-23e4029c949c","Type":"ContainerDied","Data":"75aed0c59374edf1569b77da32cdb9b7e67e7e26d83e53c2564c247d2d8910a6"}
Mar 21 05:11:42 crc kubenswrapper[4580]: I0321 05:11:42.840192 4580 generic.go:334] "Generic (PLEG): container finished" podID="d0d1dcc7-e806-4348-94ab-347efc9930b7" containerID="5216f1062d76a636ca1b786578bf61c096a09052c1ea40ecb1f3b558aad85422" exitCode=0
Mar 21 05:11:42 crc kubenswrapper[4580]: I0321 05:11:42.840234 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mwp8r" event={"ID":"d0d1dcc7-e806-4348-94ab-347efc9930b7","Type":"ContainerDied","Data":"5216f1062d76a636ca1b786578bf61c096a09052c1ea40ecb1f3b558aad85422"}
Mar 21 05:11:42 crc kubenswrapper[4580]: I0321 05:11:42.842233 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9c58cbe-bfe9-48e0-8cd4-983e565c2d30" containerID="bd8c14b7665fc06a8d12af5e3eba0ab892bd9185f2214c4c239e2a5951ee7b42" exitCode=0
Mar 21 05:11:42 crc kubenswrapper[4580]: I0321 05:11:42.842303 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jjv5q-config-2fhqf" event={"ID":"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30","Type":"ContainerDied","Data":"bd8c14b7665fc06a8d12af5e3eba0ab892bd9185f2214c4c239e2a5951ee7b42"}
Mar 21 05:11:42 crc kubenswrapper[4580]: I0321 05:11:42.843591 4580 generic.go:334] "Generic (PLEG): container finished" podID="5fbe609f-6d91-4f8b-82b9-17a602597351" containerID="a4d48039e22c74290ecbab3c2c18b56cecfaa3b35f380680d35279a2419f88eb" exitCode=0
Mar 21 05:11:42 crc kubenswrapper[4580]: I0321 05:11:42.843631 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hnpbd" event={"ID":"5fbe609f-6d91-4f8b-82b9-17a602597351","Type":"ContainerDied","Data":"a4d48039e22c74290ecbab3c2c18b56cecfaa3b35f380680d35279a2419f88eb"}
Mar 21 05:11:43 crc kubenswrapper[4580]: I0321 05:11:43.351585 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-jjv5q"
Mar 21 05:11:43 crc kubenswrapper[4580]: I0321 05:11:43.627450 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf6e0a0b-20cd-4471-ada8-85566de1d2f9" path="/var/lib/kubelet/pods/cf6e0a0b-20cd-4471-ada8-85566de1d2f9/volumes"
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.276815 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hnpbd"
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.393049 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr9pc\" (UniqueName: \"kubernetes.io/projected/5fbe609f-6d91-4f8b-82b9-17a602597351-kube-api-access-kr9pc\") pod \"5fbe609f-6d91-4f8b-82b9-17a602597351\" (UID: \"5fbe609f-6d91-4f8b-82b9-17a602597351\") "
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.393170 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fbe609f-6d91-4f8b-82b9-17a602597351-operator-scripts\") pod \"5fbe609f-6d91-4f8b-82b9-17a602597351\" (UID: \"5fbe609f-6d91-4f8b-82b9-17a602597351\") "
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.393942 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fbe609f-6d91-4f8b-82b9-17a602597351-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5fbe609f-6d91-4f8b-82b9-17a602597351" (UID: "5fbe609f-6d91-4f8b-82b9-17a602597351"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.400993 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fbe609f-6d91-4f8b-82b9-17a602597351-kube-api-access-kr9pc" (OuterVolumeSpecName: "kube-api-access-kr9pc") pod "5fbe609f-6d91-4f8b-82b9-17a602597351" (UID: "5fbe609f-6d91-4f8b-82b9-17a602597351"). InnerVolumeSpecName "kube-api-access-kr9pc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.498209 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fbe609f-6d91-4f8b-82b9-17a602597351-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.498235 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr9pc\" (UniqueName: \"kubernetes.io/projected/5fbe609f-6d91-4f8b-82b9-17a602597351-kube-api-access-kr9pc\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.576424 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8ced-account-create-update-rvjp7"
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.600399 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jjv5q-config-2fhqf"
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.688474 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mwp8r"
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.701483 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-scripts\") pod \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") "
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.701616 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f2rk\" (UniqueName: \"kubernetes.io/projected/6da615d4-b81c-4ad0-90b2-23e4029c949c-kube-api-access-2f2rk\") pod \"6da615d4-b81c-4ad0-90b2-23e4029c949c\" (UID: \"6da615d4-b81c-4ad0-90b2-23e4029c949c\") "
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.701645 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x76b7\" (UniqueName: \"kubernetes.io/projected/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-kube-api-access-x76b7\") pod \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") "
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.701680 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-run\") pod \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") "
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.701721 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-additional-scripts\") pod \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") "
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.701756 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da615d4-b81c-4ad0-90b2-23e4029c949c-operator-scripts\") pod \"6da615d4-b81c-4ad0-90b2-23e4029c949c\" (UID: \"6da615d4-b81c-4ad0-90b2-23e4029c949c\") "
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.701882 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-log-ovn\") pod \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") "
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.701911 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-run-ovn\") pod \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\" (UID: \"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30\") "
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.702414 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a9c58cbe-bfe9-48e0-8cd4-983e565c2d30" (UID: "a9c58cbe-bfe9-48e0-8cd4-983e565c2d30"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.703538 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-scripts" (OuterVolumeSpecName: "scripts") pod "a9c58cbe-bfe9-48e0-8cd4-983e565c2d30" (UID: "a9c58cbe-bfe9-48e0-8cd4-983e565c2d30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.711120 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da615d4-b81c-4ad0-90b2-23e4029c949c-kube-api-access-2f2rk" (OuterVolumeSpecName: "kube-api-access-2f2rk") pod "6da615d4-b81c-4ad0-90b2-23e4029c949c" (UID: "6da615d4-b81c-4ad0-90b2-23e4029c949c"). InnerVolumeSpecName "kube-api-access-2f2rk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.712252 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a9c58cbe-bfe9-48e0-8cd4-983e565c2d30" (UID: "a9c58cbe-bfe9-48e0-8cd4-983e565c2d30"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.712342 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a9c58cbe-bfe9-48e0-8cd4-983e565c2d30" (UID: "a9c58cbe-bfe9-48e0-8cd4-983e565c2d30"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.712894 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6da615d4-b81c-4ad0-90b2-23e4029c949c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6da615d4-b81c-4ad0-90b2-23e4029c949c" (UID: "6da615d4-b81c-4ad0-90b2-23e4029c949c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.712296 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-run" (OuterVolumeSpecName: "var-run") pod "a9c58cbe-bfe9-48e0-8cd4-983e565c2d30" (UID: "a9c58cbe-bfe9-48e0-8cd4-983e565c2d30"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.714514 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-kube-api-access-x76b7" (OuterVolumeSpecName: "kube-api-access-x76b7") pod "a9c58cbe-bfe9-48e0-8cd4-983e565c2d30" (UID: "a9c58cbe-bfe9-48e0-8cd4-983e565c2d30"). InnerVolumeSpecName "kube-api-access-x76b7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.803817 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0d1dcc7-e806-4348-94ab-347efc9930b7-operator-scripts\") pod \"d0d1dcc7-e806-4348-94ab-347efc9930b7\" (UID: \"d0d1dcc7-e806-4348-94ab-347efc9930b7\") "
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.803947 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgsp6\" (UniqueName: \"kubernetes.io/projected/d0d1dcc7-e806-4348-94ab-347efc9930b7-kube-api-access-lgsp6\") pod \"d0d1dcc7-e806-4348-94ab-347efc9930b7\" (UID: \"d0d1dcc7-e806-4348-94ab-347efc9930b7\") "
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.804340 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0d1dcc7-e806-4348-94ab-347efc9930b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0d1dcc7-e806-4348-94ab-347efc9930b7" (UID: "d0d1dcc7-e806-4348-94ab-347efc9930b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.804394 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da615d4-b81c-4ad0-90b2-23e4029c949c-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.804412 4580 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.804423 4580 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.804434 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.804444 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f2rk\" (UniqueName: \"kubernetes.io/projected/6da615d4-b81c-4ad0-90b2-23e4029c949c-kube-api-access-2f2rk\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.804456 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x76b7\" (UniqueName: \"kubernetes.io/projected/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-kube-api-access-x76b7\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.804467 4580 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-var-run\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.804478 4580 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.806982 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0d1dcc7-e806-4348-94ab-347efc9930b7-kube-api-access-lgsp6" (OuterVolumeSpecName: "kube-api-access-lgsp6") pod "d0d1dcc7-e806-4348-94ab-347efc9930b7" (UID: "d0d1dcc7-e806-4348-94ab-347efc9930b7"). InnerVolumeSpecName "kube-api-access-lgsp6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.861810 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mwp8r" event={"ID":"d0d1dcc7-e806-4348-94ab-347efc9930b7","Type":"ContainerDied","Data":"02dbd68cd2bcc621e225e331ea6d33c61e95162c13b173e4b37d58f144805342"}
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.861841 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mwp8r"
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.861854 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02dbd68cd2bcc621e225e331ea6d33c61e95162c13b173e4b37d58f144805342"
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.863658 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jjv5q-config-2fhqf" event={"ID":"a9c58cbe-bfe9-48e0-8cd4-983e565c2d30","Type":"ContainerDied","Data":"debe77220e3034f9542eb7dc1ec39e2ba7bf699687fefe1b21bfc9c5858b77ca"}
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.863699 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="debe77220e3034f9542eb7dc1ec39e2ba7bf699687fefe1b21bfc9c5858b77ca"
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.863726 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jjv5q-config-2fhqf"
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.865263 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hnpbd" event={"ID":"5fbe609f-6d91-4f8b-82b9-17a602597351","Type":"ContainerDied","Data":"544e9a7bfc535ad3ca07f641aa5abb9ca66dc183925d62191d0cf125a22c6b61"}
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.865287 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="544e9a7bfc535ad3ca07f641aa5abb9ca66dc183925d62191d0cf125a22c6b61"
Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.865381 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hnpbd" Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.868211 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8ced-account-create-update-rvjp7" event={"ID":"6da615d4-b81c-4ad0-90b2-23e4029c949c","Type":"ContainerDied","Data":"abe3b2e3bc8b27b3725f3ee8093e7c3be454930c40b745b415bb76a998eee056"} Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.868256 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abe3b2e3bc8b27b3725f3ee8093e7c3be454930c40b745b415bb76a998eee056" Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.868257 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8ced-account-create-update-rvjp7" Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.905952 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0d1dcc7-e806-4348-94ab-347efc9930b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:44 crc kubenswrapper[4580]: I0321 05:11:44.905992 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgsp6\" (UniqueName: \"kubernetes.io/projected/d0d1dcc7-e806-4348-94ab-347efc9930b7-kube-api-access-lgsp6\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:45 crc kubenswrapper[4580]: I0321 05:11:45.007537 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:11:45 crc kubenswrapper[4580]: E0321 05:11:45.008002 4580 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 05:11:45 crc kubenswrapper[4580]: E0321 05:11:45.008093 4580 projected.go:194] Error 
preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 05:11:45 crc kubenswrapper[4580]: E0321 05:11:45.008209 4580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift podName:d59ab798-9ae9-4f47-b58b-36417592eef2 nodeName:}" failed. No retries permitted until 2026-03-21 05:12:01.008188086 +0000 UTC m=+1226.090771714 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift") pod "swift-storage-0" (UID: "d59ab798-9ae9-4f47-b58b-36417592eef2") : configmap "swift-ring-files" not found Mar 21 05:11:45 crc kubenswrapper[4580]: I0321 05:11:45.804440 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-jjv5q-config-2fhqf"] Mar 21 05:11:45 crc kubenswrapper[4580]: I0321 05:11:45.816356 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-jjv5q-config-2fhqf"] Mar 21 05:11:47 crc kubenswrapper[4580]: I0321 05:11:47.629797 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9c58cbe-bfe9-48e0-8cd4-983e565c2d30" path="/var/lib/kubelet/pods/a9c58cbe-bfe9-48e0-8cd4-983e565c2d30/volumes" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.719790 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-n4vzq"] Mar 21 05:11:48 crc kubenswrapper[4580]: E0321 05:11:48.720356 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6e0a0b-20cd-4471-ada8-85566de1d2f9" containerName="init" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720371 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6e0a0b-20cd-4471-ada8-85566de1d2f9" containerName="init" Mar 21 05:11:48 crc kubenswrapper[4580]: E0321 05:11:48.720385 4580 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bf6c0d9f-d17b-458f-a653-713f834d70cc" containerName="mariadb-account-create-update" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720391 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6c0d9f-d17b-458f-a653-713f834d70cc" containerName="mariadb-account-create-update" Mar 21 05:11:48 crc kubenswrapper[4580]: E0321 05:11:48.720403 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da615d4-b81c-4ad0-90b2-23e4029c949c" containerName="mariadb-account-create-update" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720409 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da615d4-b81c-4ad0-90b2-23e4029c949c" containerName="mariadb-account-create-update" Mar 21 05:11:48 crc kubenswrapper[4580]: E0321 05:11:48.720417 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d1dcc7-e806-4348-94ab-347efc9930b7" containerName="mariadb-database-create" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720423 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d1dcc7-e806-4348-94ab-347efc9930b7" containerName="mariadb-database-create" Mar 21 05:11:48 crc kubenswrapper[4580]: E0321 05:11:48.720434 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4f77fa-08f5-4c3c-9300-112440f9acc1" containerName="mariadb-account-create-update" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720439 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4f77fa-08f5-4c3c-9300-112440f9acc1" containerName="mariadb-account-create-update" Mar 21 05:11:48 crc kubenswrapper[4580]: E0321 05:11:48.720453 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbe609f-6d91-4f8b-82b9-17a602597351" containerName="mariadb-account-create-update" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720459 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbe609f-6d91-4f8b-82b9-17a602597351" containerName="mariadb-account-create-update" Mar 21 
05:11:48 crc kubenswrapper[4580]: E0321 05:11:48.720472 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6afa77b-c41a-42ad-a253-64cf7e6e5544" containerName="mariadb-database-create" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720478 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6afa77b-c41a-42ad-a253-64cf7e6e5544" containerName="mariadb-database-create" Mar 21 05:11:48 crc kubenswrapper[4580]: E0321 05:11:48.720483 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6e0a0b-20cd-4471-ada8-85566de1d2f9" containerName="dnsmasq-dns" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720489 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6e0a0b-20cd-4471-ada8-85566de1d2f9" containerName="dnsmasq-dns" Mar 21 05:11:48 crc kubenswrapper[4580]: E0321 05:11:48.720501 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c58cbe-bfe9-48e0-8cd4-983e565c2d30" containerName="ovn-config" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720507 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c58cbe-bfe9-48e0-8cd4-983e565c2d30" containerName="ovn-config" Mar 21 05:11:48 crc kubenswrapper[4580]: E0321 05:11:48.720515 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34fd0f6-3c75-42ff-909d-32c6255e5c68" containerName="mariadb-database-create" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720520 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34fd0f6-3c75-42ff-909d-32c6255e5c68" containerName="mariadb-database-create" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720657 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="6da615d4-b81c-4ad0-90b2-23e4029c949c" containerName="mariadb-account-create-update" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720669 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf6c0d9f-d17b-458f-a653-713f834d70cc" 
containerName="mariadb-account-create-update" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720682 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34fd0f6-3c75-42ff-909d-32c6255e5c68" containerName="mariadb-database-create" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720691 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9c58cbe-bfe9-48e0-8cd4-983e565c2d30" containerName="ovn-config" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720700 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0d1dcc7-e806-4348-94ab-347efc9930b7" containerName="mariadb-database-create" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720707 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf6e0a0b-20cd-4471-ada8-85566de1d2f9" containerName="dnsmasq-dns" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720715 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6afa77b-c41a-42ad-a253-64cf7e6e5544" containerName="mariadb-database-create" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720724 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca4f77fa-08f5-4c3c-9300-112440f9acc1" containerName="mariadb-account-create-update" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.720730 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fbe609f-6d91-4f8b-82b9-17a602597351" containerName="mariadb-account-create-update" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.721208 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.729416 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.729470 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-65qpr" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.745486 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n4vzq"] Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.877868 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-db-sync-config-data\") pod \"glance-db-sync-n4vzq\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.877920 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-combined-ca-bundle\") pod \"glance-db-sync-n4vzq\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.878059 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h95zr\" (UniqueName: \"kubernetes.io/projected/f86c26eb-bb22-460c-8a45-191a02924112-kube-api-access-h95zr\") pod \"glance-db-sync-n4vzq\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.878330 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-config-data\") pod \"glance-db-sync-n4vzq\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.979436 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-db-sync-config-data\") pod \"glance-db-sync-n4vzq\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.979493 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-combined-ca-bundle\") pod \"glance-db-sync-n4vzq\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.979552 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h95zr\" (UniqueName: \"kubernetes.io/projected/f86c26eb-bb22-460c-8a45-191a02924112-kube-api-access-h95zr\") pod \"glance-db-sync-n4vzq\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.979623 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-config-data\") pod \"glance-db-sync-n4vzq\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.986197 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-db-sync-config-data\") pod \"glance-db-sync-n4vzq\" (UID: 
\"f86c26eb-bb22-460c-8a45-191a02924112\") " pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.986581 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-combined-ca-bundle\") pod \"glance-db-sync-n4vzq\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:48 crc kubenswrapper[4580]: I0321 05:11:48.987018 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-config-data\") pod \"glance-db-sync-n4vzq\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:49 crc kubenswrapper[4580]: I0321 05:11:49.004467 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h95zr\" (UniqueName: \"kubernetes.io/projected/f86c26eb-bb22-460c-8a45-191a02924112-kube-api-access-h95zr\") pod \"glance-db-sync-n4vzq\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:49 crc kubenswrapper[4580]: I0321 05:11:49.043584 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n4vzq" Mar 21 05:11:49 crc kubenswrapper[4580]: W0321 05:11:49.631904 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf86c26eb_bb22_460c_8a45_191a02924112.slice/crio-c811f7c0d3e27ca679edddcfe96806d12a390e73a0bf11a30d4778cf3907d7f2 WatchSource:0}: Error finding container c811f7c0d3e27ca679edddcfe96806d12a390e73a0bf11a30d4778cf3907d7f2: Status 404 returned error can't find the container with id c811f7c0d3e27ca679edddcfe96806d12a390e73a0bf11a30d4778cf3907d7f2 Mar 21 05:11:49 crc kubenswrapper[4580]: I0321 05:11:49.632064 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n4vzq"] Mar 21 05:11:49 crc kubenswrapper[4580]: I0321 05:11:49.804968 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:11:49 crc kubenswrapper[4580]: I0321 05:11:49.908524 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n4vzq" event={"ID":"f86c26eb-bb22-460c-8a45-191a02924112","Type":"ContainerStarted","Data":"c811f7c0d3e27ca679edddcfe96806d12a390e73a0bf11a30d4778cf3907d7f2"} Mar 21 05:11:50 crc kubenswrapper[4580]: I0321 05:11:50.394959 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 21 05:11:50 crc kubenswrapper[4580]: I0321 05:11:50.931564 4580 generic.go:334] "Generic (PLEG): container finished" podID="3d23c194-d398-4264-8726-c75316c85eff" containerID="4fbc26acabfdc474a897bf00ef5f1b4eeafff4620604a7c5200e6e6d41fa5afb" exitCode=0 Mar 21 05:11:50 crc kubenswrapper[4580]: I0321 05:11:50.931613 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kkh4r" event={"ID":"3d23c194-d398-4264-8726-c75316c85eff","Type":"ContainerDied","Data":"4fbc26acabfdc474a897bf00ef5f1b4eeafff4620604a7c5200e6e6d41fa5afb"} Mar 21 05:11:52 crc 
kubenswrapper[4580]: I0321 05:11:52.179473 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-smbzn"] Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.180803 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-smbzn" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.199437 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-smbzn"] Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.237450 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e111d88-8363-418a-aa16-e4738fc5dfa5-operator-scripts\") pod \"cinder-db-create-smbzn\" (UID: \"8e111d88-8363-418a-aa16-e4738fc5dfa5\") " pod="openstack/cinder-db-create-smbzn" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.237774 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd7s5\" (UniqueName: \"kubernetes.io/projected/8e111d88-8363-418a-aa16-e4738fc5dfa5-kube-api-access-vd7s5\") pod \"cinder-db-create-smbzn\" (UID: \"8e111d88-8363-418a-aa16-e4738fc5dfa5\") " pod="openstack/cinder-db-create-smbzn" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.311536 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-kkck5"] Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.312848 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-kkck5" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.338899 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e111d88-8363-418a-aa16-e4738fc5dfa5-operator-scripts\") pod \"cinder-db-create-smbzn\" (UID: \"8e111d88-8363-418a-aa16-e4738fc5dfa5\") " pod="openstack/cinder-db-create-smbzn" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.339330 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd7s5\" (UniqueName: \"kubernetes.io/projected/8e111d88-8363-418a-aa16-e4738fc5dfa5-kube-api-access-vd7s5\") pod \"cinder-db-create-smbzn\" (UID: \"8e111d88-8363-418a-aa16-e4738fc5dfa5\") " pod="openstack/cinder-db-create-smbzn" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.339612 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e111d88-8363-418a-aa16-e4738fc5dfa5-operator-scripts\") pod \"cinder-db-create-smbzn\" (UID: \"8e111d88-8363-418a-aa16-e4738fc5dfa5\") " pod="openstack/cinder-db-create-smbzn" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.363639 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-kkck5"] Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.397535 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd7s5\" (UniqueName: \"kubernetes.io/projected/8e111d88-8363-418a-aa16-e4738fc5dfa5-kube-api-access-vd7s5\") pod \"cinder-db-create-smbzn\" (UID: \"8e111d88-8363-418a-aa16-e4738fc5dfa5\") " pod="openstack/cinder-db-create-smbzn" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.450134 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/949065da-d504-4d31-800b-cfb6b82bb559-operator-scripts\") pod \"neutron-db-create-kkck5\" (UID: \"949065da-d504-4d31-800b-cfb6b82bb559\") " pod="openstack/neutron-db-create-kkck5" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.450225 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b67m\" (UniqueName: \"kubernetes.io/projected/949065da-d504-4d31-800b-cfb6b82bb559-kube-api-access-8b67m\") pod \"neutron-db-create-kkck5\" (UID: \"949065da-d504-4d31-800b-cfb6b82bb559\") " pod="openstack/neutron-db-create-kkck5" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.502026 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-38bf-account-create-update-pwp74"] Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.503121 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-38bf-account-create-update-pwp74" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.508993 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.515974 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-38bf-account-create-update-pwp74"] Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.526885 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-smbzn" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.557948 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949065da-d504-4d31-800b-cfb6b82bb559-operator-scripts\") pod \"neutron-db-create-kkck5\" (UID: \"949065da-d504-4d31-800b-cfb6b82bb559\") " pod="openstack/neutron-db-create-kkck5" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.558620 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949065da-d504-4d31-800b-cfb6b82bb559-operator-scripts\") pod \"neutron-db-create-kkck5\" (UID: \"949065da-d504-4d31-800b-cfb6b82bb559\") " pod="openstack/neutron-db-create-kkck5" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.558714 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b67m\" (UniqueName: \"kubernetes.io/projected/949065da-d504-4d31-800b-cfb6b82bb559-kube-api-access-8b67m\") pod \"neutron-db-create-kkck5\" (UID: \"949065da-d504-4d31-800b-cfb6b82bb559\") " pod="openstack/neutron-db-create-kkck5" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.665058 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtx6c\" (UniqueName: \"kubernetes.io/projected/3b154885-0b18-4926-b81c-c10208075c27-kube-api-access-rtx6c\") pod \"cinder-38bf-account-create-update-pwp74\" (UID: \"3b154885-0b18-4926-b81c-c10208075c27\") " pod="openstack/cinder-38bf-account-create-update-pwp74" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.665363 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b154885-0b18-4926-b81c-c10208075c27-operator-scripts\") pod \"cinder-38bf-account-create-update-pwp74\" (UID: 
\"3b154885-0b18-4926-b81c-c10208075c27\") " pod="openstack/cinder-38bf-account-create-update-pwp74" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.672862 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b67m\" (UniqueName: \"kubernetes.io/projected/949065da-d504-4d31-800b-cfb6b82bb559-kube-api-access-8b67m\") pod \"neutron-db-create-kkck5\" (UID: \"949065da-d504-4d31-800b-cfb6b82bb559\") " pod="openstack/neutron-db-create-kkck5" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.708041 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0055-account-create-update-jxb6l"] Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.709225 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0055-account-create-update-jxb6l" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.721206 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.735501 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-26d4d"] Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.736712 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-26d4d" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.756036 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0055-account-create-update-jxb6l"] Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.768538 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b154885-0b18-4926-b81c-c10208075c27-operator-scripts\") pod \"cinder-38bf-account-create-update-pwp74\" (UID: \"3b154885-0b18-4926-b81c-c10208075c27\") " pod="openstack/cinder-38bf-account-create-update-pwp74" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.768602 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtx6c\" (UniqueName: \"kubernetes.io/projected/3b154885-0b18-4926-b81c-c10208075c27-kube-api-access-rtx6c\") pod \"cinder-38bf-account-create-update-pwp74\" (UID: \"3b154885-0b18-4926-b81c-c10208075c27\") " pod="openstack/cinder-38bf-account-create-update-pwp74" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.769469 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-26d4d"] Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.770843 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b154885-0b18-4926-b81c-c10208075c27-operator-scripts\") pod \"cinder-38bf-account-create-update-pwp74\" (UID: \"3b154885-0b18-4926-b81c-c10208075c27\") " pod="openstack/cinder-38bf-account-create-update-pwp74" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.836676 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtx6c\" (UniqueName: \"kubernetes.io/projected/3b154885-0b18-4926-b81c-c10208075c27-kube-api-access-rtx6c\") pod \"cinder-38bf-account-create-update-pwp74\" (UID: 
\"3b154885-0b18-4926-b81c-c10208075c27\") " pod="openstack/cinder-38bf-account-create-update-pwp74" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.848153 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-38bf-account-create-update-pwp74" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.883031 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec630d4-9254-4a93-b61f-d960ac2b3ccc-operator-scripts\") pod \"barbican-db-create-26d4d\" (UID: \"1ec630d4-9254-4a93-b61f-d960ac2b3ccc\") " pod="openstack/barbican-db-create-26d4d" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.883184 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s9v5\" (UniqueName: \"kubernetes.io/projected/486431e9-d00c-4a61-b831-619c73ef470f-kube-api-access-6s9v5\") pod \"neutron-0055-account-create-update-jxb6l\" (UID: \"486431e9-d00c-4a61-b831-619c73ef470f\") " pod="openstack/neutron-0055-account-create-update-jxb6l" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.883275 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvr24\" (UniqueName: \"kubernetes.io/projected/1ec630d4-9254-4a93-b61f-d960ac2b3ccc-kube-api-access-rvr24\") pod \"barbican-db-create-26d4d\" (UID: \"1ec630d4-9254-4a93-b61f-d960ac2b3ccc\") " pod="openstack/barbican-db-create-26d4d" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.883332 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486431e9-d00c-4a61-b831-619c73ef470f-operator-scripts\") pod \"neutron-0055-account-create-update-jxb6l\" (UID: \"486431e9-d00c-4a61-b831-619c73ef470f\") " pod="openstack/neutron-0055-account-create-update-jxb6l" Mar 
21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.892512 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.910910 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-zxh5l"] Mar 21 05:11:52 crc kubenswrapper[4580]: E0321 05:11:52.911850 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d23c194-d398-4264-8726-c75316c85eff" containerName="swift-ring-rebalance" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.911870 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d23c194-d398-4264-8726-c75316c85eff" containerName="swift-ring-rebalance" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.912212 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d23c194-d398-4264-8726-c75316c85eff" containerName="swift-ring-rebalance" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.914241 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zxh5l" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.921191 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sqx2b" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.923617 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zxh5l"] Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.923904 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.924012 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.924071 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.940154 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-kkck5" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.984810 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-combined-ca-bundle\") pod \"3d23c194-d398-4264-8726-c75316c85eff\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.984859 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-swiftconf\") pod \"3d23c194-d398-4264-8726-c75316c85eff\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.984893 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-dispersionconf\") pod \"3d23c194-d398-4264-8726-c75316c85eff\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.984942 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxpjh\" (UniqueName: \"kubernetes.io/projected/3d23c194-d398-4264-8726-c75316c85eff-kube-api-access-cxpjh\") pod \"3d23c194-d398-4264-8726-c75316c85eff\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.984965 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d23c194-d398-4264-8726-c75316c85eff-etc-swift\") pod \"3d23c194-d398-4264-8726-c75316c85eff\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.985038 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d23c194-d398-4264-8726-c75316c85eff-scripts\") pod \"3d23c194-d398-4264-8726-c75316c85eff\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.985058 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d23c194-d398-4264-8726-c75316c85eff-ring-data-devices\") pod \"3d23c194-d398-4264-8726-c75316c85eff\" (UID: \"3d23c194-d398-4264-8726-c75316c85eff\") " Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.985237 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d29c081-c37a-46ac-8354-178685882ce2-config-data\") pod \"keystone-db-sync-zxh5l\" (UID: \"0d29c081-c37a-46ac-8354-178685882ce2\") " pod="openstack/keystone-db-sync-zxh5l" Mar 21 05:11:52 crc 
kubenswrapper[4580]: I0321 05:11:52.985271 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486431e9-d00c-4a61-b831-619c73ef470f-operator-scripts\") pod \"neutron-0055-account-create-update-jxb6l\" (UID: \"486431e9-d00c-4a61-b831-619c73ef470f\") " pod="openstack/neutron-0055-account-create-update-jxb6l" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.985292 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec630d4-9254-4a93-b61f-d960ac2b3ccc-operator-scripts\") pod \"barbican-db-create-26d4d\" (UID: \"1ec630d4-9254-4a93-b61f-d960ac2b3ccc\") " pod="openstack/barbican-db-create-26d4d" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.985338 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79nmt\" (UniqueName: \"kubernetes.io/projected/0d29c081-c37a-46ac-8354-178685882ce2-kube-api-access-79nmt\") pod \"keystone-db-sync-zxh5l\" (UID: \"0d29c081-c37a-46ac-8354-178685882ce2\") " pod="openstack/keystone-db-sync-zxh5l" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.985428 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s9v5\" (UniqueName: \"kubernetes.io/projected/486431e9-d00c-4a61-b831-619c73ef470f-kube-api-access-6s9v5\") pod \"neutron-0055-account-create-update-jxb6l\" (UID: \"486431e9-d00c-4a61-b831-619c73ef470f\") " pod="openstack/neutron-0055-account-create-update-jxb6l" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.985486 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvr24\" (UniqueName: \"kubernetes.io/projected/1ec630d4-9254-4a93-b61f-d960ac2b3ccc-kube-api-access-rvr24\") pod \"barbican-db-create-26d4d\" (UID: \"1ec630d4-9254-4a93-b61f-d960ac2b3ccc\") " 
pod="openstack/barbican-db-create-26d4d" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.985507 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d29c081-c37a-46ac-8354-178685882ce2-combined-ca-bundle\") pod \"keystone-db-sync-zxh5l\" (UID: \"0d29c081-c37a-46ac-8354-178685882ce2\") " pod="openstack/keystone-db-sync-zxh5l" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.987192 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d23c194-d398-4264-8726-c75316c85eff-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3d23c194-d398-4264-8726-c75316c85eff" (UID: "3d23c194-d398-4264-8726-c75316c85eff"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.993568 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d23c194-d398-4264-8726-c75316c85eff-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3d23c194-d398-4264-8726-c75316c85eff" (UID: "3d23c194-d398-4264-8726-c75316c85eff"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.996685 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec630d4-9254-4a93-b61f-d960ac2b3ccc-operator-scripts\") pod \"barbican-db-create-26d4d\" (UID: \"1ec630d4-9254-4a93-b61f-d960ac2b3ccc\") " pod="openstack/barbican-db-create-26d4d" Mar 21 05:11:52 crc kubenswrapper[4580]: I0321 05:11:52.997265 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486431e9-d00c-4a61-b831-619c73ef470f-operator-scripts\") pod \"neutron-0055-account-create-update-jxb6l\" (UID: \"486431e9-d00c-4a61-b831-619c73ef470f\") " pod="openstack/neutron-0055-account-create-update-jxb6l" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.020010 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-01e5-account-create-update-ptjxt"] Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.021473 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-01e5-account-create-update-ptjxt" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.024904 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.033630 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d23c194-d398-4264-8726-c75316c85eff-kube-api-access-cxpjh" (OuterVolumeSpecName: "kube-api-access-cxpjh") pod "3d23c194-d398-4264-8726-c75316c85eff" (UID: "3d23c194-d398-4264-8726-c75316c85eff"). InnerVolumeSpecName "kube-api-access-cxpjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.036288 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kkh4r" event={"ID":"3d23c194-d398-4264-8726-c75316c85eff","Type":"ContainerDied","Data":"ebcebd15d2a6a05311db51a79f4975085eb6ba00ff8619c9ae1dcf69896f7594"} Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.036323 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebcebd15d2a6a05311db51a79f4975085eb6ba00ff8619c9ae1dcf69896f7594" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.036476 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kkh4r" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.091340 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvr24\" (UniqueName: \"kubernetes.io/projected/1ec630d4-9254-4a93-b61f-d960ac2b3ccc-kube-api-access-rvr24\") pod \"barbican-db-create-26d4d\" (UID: \"1ec630d4-9254-4a93-b61f-d960ac2b3ccc\") " pod="openstack/barbican-db-create-26d4d" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.091853 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s9v5\" (UniqueName: \"kubernetes.io/projected/486431e9-d00c-4a61-b831-619c73ef470f-kube-api-access-6s9v5\") pod \"neutron-0055-account-create-update-jxb6l\" (UID: \"486431e9-d00c-4a61-b831-619c73ef470f\") " pod="openstack/neutron-0055-account-create-update-jxb6l" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.092498 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79nmt\" (UniqueName: \"kubernetes.io/projected/0d29c081-c37a-46ac-8354-178685882ce2-kube-api-access-79nmt\") pod \"keystone-db-sync-zxh5l\" (UID: \"0d29c081-c37a-46ac-8354-178685882ce2\") " pod="openstack/keystone-db-sync-zxh5l" Mar 21 05:11:53 crc 
kubenswrapper[4580]: I0321 05:11:53.092614 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33-operator-scripts\") pod \"barbican-01e5-account-create-update-ptjxt\" (UID: \"8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33\") " pod="openstack/barbican-01e5-account-create-update-ptjxt" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.092655 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d29c081-c37a-46ac-8354-178685882ce2-combined-ca-bundle\") pod \"keystone-db-sync-zxh5l\" (UID: \"0d29c081-c37a-46ac-8354-178685882ce2\") " pod="openstack/keystone-db-sync-zxh5l" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.092677 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d29c081-c37a-46ac-8354-178685882ce2-config-data\") pod \"keystone-db-sync-zxh5l\" (UID: \"0d29c081-c37a-46ac-8354-178685882ce2\") " pod="openstack/keystone-db-sync-zxh5l" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.092704 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv5g4\" (UniqueName: \"kubernetes.io/projected/8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33-kube-api-access-qv5g4\") pod \"barbican-01e5-account-create-update-ptjxt\" (UID: \"8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33\") " pod="openstack/barbican-01e5-account-create-update-ptjxt" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.092743 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxpjh\" (UniqueName: \"kubernetes.io/projected/3d23c194-d398-4264-8726-c75316c85eff-kube-api-access-cxpjh\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.092753 4580 reconciler_common.go:293] "Volume 
detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d23c194-d398-4264-8726-c75316c85eff-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.092761 4580 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d23c194-d398-4264-8726-c75316c85eff-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.093737 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3d23c194-d398-4264-8726-c75316c85eff" (UID: "3d23c194-d398-4264-8726-c75316c85eff"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.106666 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d23c194-d398-4264-8726-c75316c85eff" (UID: "3d23c194-d398-4264-8726-c75316c85eff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.111072 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d29c081-c37a-46ac-8354-178685882ce2-combined-ca-bundle\") pod \"keystone-db-sync-zxh5l\" (UID: \"0d29c081-c37a-46ac-8354-178685882ce2\") " pod="openstack/keystone-db-sync-zxh5l" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.115412 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-01e5-account-create-update-ptjxt"] Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.124909 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79nmt\" (UniqueName: \"kubernetes.io/projected/0d29c081-c37a-46ac-8354-178685882ce2-kube-api-access-79nmt\") pod \"keystone-db-sync-zxh5l\" (UID: \"0d29c081-c37a-46ac-8354-178685882ce2\") " pod="openstack/keystone-db-sync-zxh5l" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.146365 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d29c081-c37a-46ac-8354-178685882ce2-config-data\") pod \"keystone-db-sync-zxh5l\" (UID: \"0d29c081-c37a-46ac-8354-178685882ce2\") " pod="openstack/keystone-db-sync-zxh5l" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.168458 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d23c194-d398-4264-8726-c75316c85eff-scripts" (OuterVolumeSpecName: "scripts") pod "3d23c194-d398-4264-8726-c75316c85eff" (UID: "3d23c194-d398-4264-8726-c75316c85eff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.183843 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3d23c194-d398-4264-8726-c75316c85eff" (UID: "3d23c194-d398-4264-8726-c75316c85eff"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.197514 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv5g4\" (UniqueName: \"kubernetes.io/projected/8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33-kube-api-access-qv5g4\") pod \"barbican-01e5-account-create-update-ptjxt\" (UID: \"8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33\") " pod="openstack/barbican-01e5-account-create-update-ptjxt" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.197645 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33-operator-scripts\") pod \"barbican-01e5-account-create-update-ptjxt\" (UID: \"8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33\") " pod="openstack/barbican-01e5-account-create-update-ptjxt" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.197712 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d23c194-d398-4264-8726-c75316c85eff-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.197724 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.197735 4580 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.197744 4580 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d23c194-d398-4264-8726-c75316c85eff-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.198324 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33-operator-scripts\") pod \"barbican-01e5-account-create-update-ptjxt\" (UID: \"8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33\") " pod="openstack/barbican-01e5-account-create-update-ptjxt" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.248471 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv5g4\" (UniqueName: \"kubernetes.io/projected/8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33-kube-api-access-qv5g4\") pod \"barbican-01e5-account-create-update-ptjxt\" (UID: \"8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33\") " pod="openstack/barbican-01e5-account-create-update-ptjxt" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.253383 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zxh5l" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.348065 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0055-account-create-update-jxb6l" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.366138 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-26d4d" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.395230 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-01e5-account-create-update-ptjxt" Mar 21 05:11:53 crc kubenswrapper[4580]: I0321 05:11:53.587508 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-smbzn"] Mar 21 05:11:54 crc kubenswrapper[4580]: I0321 05:11:53.795705 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-kkck5"] Mar 21 05:11:54 crc kubenswrapper[4580]: I0321 05:11:53.820175 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-38bf-account-create-update-pwp74"] Mar 21 05:11:54 crc kubenswrapper[4580]: I0321 05:11:54.060827 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-kkck5" event={"ID":"949065da-d504-4d31-800b-cfb6b82bb559","Type":"ContainerStarted","Data":"05237fba639f5471d66275a17fc51decfe9e9c5944d0cd65f44b60a5f33bf9cf"} Mar 21 05:11:54 crc kubenswrapper[4580]: I0321 05:11:54.061265 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-kkck5" event={"ID":"949065da-d504-4d31-800b-cfb6b82bb559","Type":"ContainerStarted","Data":"79dfd67aa817563e9d9a314e85decce79422f23d8d2bd05209bb884881f4510f"} Mar 21 05:11:54 crc kubenswrapper[4580]: I0321 05:11:54.065774 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-38bf-account-create-update-pwp74" event={"ID":"3b154885-0b18-4926-b81c-c10208075c27","Type":"ContainerStarted","Data":"adb554f597721ec1d86a12646755cb046493586305cebd9ba9c001ca76daeef1"} Mar 21 05:11:54 crc kubenswrapper[4580]: I0321 05:11:54.073732 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-smbzn" event={"ID":"8e111d88-8363-418a-aa16-e4738fc5dfa5","Type":"ContainerStarted","Data":"28fe3b05e2cf1e604a0b3a0b38c1d58bf2380cd6082055556dd6d4c3f3bbdb53"} Mar 21 05:11:54 crc kubenswrapper[4580]: I0321 05:11:54.073788 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-smbzn" 
event={"ID":"8e111d88-8363-418a-aa16-e4738fc5dfa5","Type":"ContainerStarted","Data":"77994f4cf2e302d28fe8990ab277ca402d83746ff3219b19a1362bc4c37c4820"} Mar 21 05:11:54 crc kubenswrapper[4580]: I0321 05:11:54.098996 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-smbzn" podStartSLOduration=2.098972946 podStartE2EDuration="2.098972946s" podCreationTimestamp="2026-03-21 05:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:11:54.092919585 +0000 UTC m=+1219.175503233" watchObservedRunningTime="2026-03-21 05:11:54.098972946 +0000 UTC m=+1219.181556574" Mar 21 05:11:54 crc kubenswrapper[4580]: I0321 05:11:54.834325 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zxh5l"] Mar 21 05:11:54 crc kubenswrapper[4580]: W0321 05:11:54.843149 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d29c081_c37a_46ac_8354_178685882ce2.slice/crio-1b11403c957c270585d411e8f101df1fc9966dadd2b2443b03ab8544f523463f WatchSource:0}: Error finding container 1b11403c957c270585d411e8f101df1fc9966dadd2b2443b03ab8544f523463f: Status 404 returned error can't find the container with id 1b11403c957c270585d411e8f101df1fc9966dadd2b2443b03ab8544f523463f Mar 21 05:11:54 crc kubenswrapper[4580]: I0321 05:11:54.933065 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-26d4d"] Mar 21 05:11:54 crc kubenswrapper[4580]: W0321 05:11:54.940367 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ec630d4_9254_4a93_b61f_d960ac2b3ccc.slice/crio-6a326addd194b4446355eab17809560b50c4d8ea2184b1ec9f3c151e1bb9328b WatchSource:0}: Error finding container 6a326addd194b4446355eab17809560b50c4d8ea2184b1ec9f3c151e1bb9328b: Status 
404 returned error can't find the container with id 6a326addd194b4446355eab17809560b50c4d8ea2184b1ec9f3c151e1bb9328b Mar 21 05:11:54 crc kubenswrapper[4580]: I0321 05:11:54.958118 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-01e5-account-create-update-ptjxt"] Mar 21 05:11:54 crc kubenswrapper[4580]: I0321 05:11:54.970212 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0055-account-create-update-jxb6l"] Mar 21 05:11:55 crc kubenswrapper[4580]: I0321 05:11:55.137600 4580 generic.go:334] "Generic (PLEG): container finished" podID="3b154885-0b18-4926-b81c-c10208075c27" containerID="0ca53a7f97daaabc25f3bf440d40fc5b8f4aa72da12926007c9f913f5c361660" exitCode=0 Mar 21 05:11:55 crc kubenswrapper[4580]: I0321 05:11:55.137693 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-38bf-account-create-update-pwp74" event={"ID":"3b154885-0b18-4926-b81c-c10208075c27","Type":"ContainerDied","Data":"0ca53a7f97daaabc25f3bf440d40fc5b8f4aa72da12926007c9f913f5c361660"} Mar 21 05:11:55 crc kubenswrapper[4580]: I0321 05:11:55.144468 4580 generic.go:334] "Generic (PLEG): container finished" podID="8e111d88-8363-418a-aa16-e4738fc5dfa5" containerID="28fe3b05e2cf1e604a0b3a0b38c1d58bf2380cd6082055556dd6d4c3f3bbdb53" exitCode=0 Mar 21 05:11:55 crc kubenswrapper[4580]: I0321 05:11:55.144549 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-smbzn" event={"ID":"8e111d88-8363-418a-aa16-e4738fc5dfa5","Type":"ContainerDied","Data":"28fe3b05e2cf1e604a0b3a0b38c1d58bf2380cd6082055556dd6d4c3f3bbdb53"} Mar 21 05:11:55 crc kubenswrapper[4580]: I0321 05:11:55.147937 4580 generic.go:334] "Generic (PLEG): container finished" podID="949065da-d504-4d31-800b-cfb6b82bb559" containerID="05237fba639f5471d66275a17fc51decfe9e9c5944d0cd65f44b60a5f33bf9cf" exitCode=0 Mar 21 05:11:55 crc kubenswrapper[4580]: I0321 05:11:55.148009 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-kkck5" event={"ID":"949065da-d504-4d31-800b-cfb6b82bb559","Type":"ContainerDied","Data":"05237fba639f5471d66275a17fc51decfe9e9c5944d0cd65f44b60a5f33bf9cf"} Mar 21 05:11:55 crc kubenswrapper[4580]: I0321 05:11:55.155375 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-01e5-account-create-update-ptjxt" event={"ID":"8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33","Type":"ContainerStarted","Data":"88578d242de7b4126d61f50a1470ab17ecb3114c7bf4d54814dc2375bcafd55b"} Mar 21 05:11:55 crc kubenswrapper[4580]: I0321 05:11:55.158734 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0055-account-create-update-jxb6l" event={"ID":"486431e9-d00c-4a61-b831-619c73ef470f","Type":"ContainerStarted","Data":"83bf6a5bbf6774f91c173c10dbbff0db0a8d8730332d45c6620eca18a6a0ac23"} Mar 21 05:11:55 crc kubenswrapper[4580]: I0321 05:11:55.161156 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zxh5l" event={"ID":"0d29c081-c37a-46ac-8354-178685882ce2","Type":"ContainerStarted","Data":"1b11403c957c270585d411e8f101df1fc9966dadd2b2443b03ab8544f523463f"} Mar 21 05:11:55 crc kubenswrapper[4580]: I0321 05:11:55.164724 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-26d4d" event={"ID":"1ec630d4-9254-4a93-b61f-d960ac2b3ccc","Type":"ContainerStarted","Data":"6a326addd194b4446355eab17809560b50c4d8ea2184b1ec9f3c151e1bb9328b"} Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.176467 4580 generic.go:334] "Generic (PLEG): container finished" podID="8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33" containerID="da08ecf9ae625bdbd7251b180ce869bbfb2690a1bde5bb47b6452d0382dfd919" exitCode=0 Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.176528 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-01e5-account-create-update-ptjxt" 
event={"ID":"8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33","Type":"ContainerDied","Data":"da08ecf9ae625bdbd7251b180ce869bbfb2690a1bde5bb47b6452d0382dfd919"} Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.184163 4580 generic.go:334] "Generic (PLEG): container finished" podID="486431e9-d00c-4a61-b831-619c73ef470f" containerID="2b29bf2ce9732a549b39900bc56fa49051a6e8c505c055ac216f900b7fce97c3" exitCode=0 Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.184284 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0055-account-create-update-jxb6l" event={"ID":"486431e9-d00c-4a61-b831-619c73ef470f","Type":"ContainerDied","Data":"2b29bf2ce9732a549b39900bc56fa49051a6e8c505c055ac216f900b7fce97c3"} Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.185977 4580 generic.go:334] "Generic (PLEG): container finished" podID="1ec630d4-9254-4a93-b61f-d960ac2b3ccc" containerID="6e80d1c51411e1a99aac75aeae886dd19082c3fdfd0ed8fb3896665f03fcc330" exitCode=0 Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.186201 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-26d4d" event={"ID":"1ec630d4-9254-4a93-b61f-d960ac2b3ccc","Type":"ContainerDied","Data":"6e80d1c51411e1a99aac75aeae886dd19082c3fdfd0ed8fb3896665f03fcc330"} Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.782687 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-kkck5" Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.789287 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-38bf-account-create-update-pwp74" Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.818260 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-smbzn" Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.891081 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949065da-d504-4d31-800b-cfb6b82bb559-operator-scripts\") pod \"949065da-d504-4d31-800b-cfb6b82bb559\" (UID: \"949065da-d504-4d31-800b-cfb6b82bb559\") " Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.891629 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b67m\" (UniqueName: \"kubernetes.io/projected/949065da-d504-4d31-800b-cfb6b82bb559-kube-api-access-8b67m\") pod \"949065da-d504-4d31-800b-cfb6b82bb559\" (UID: \"949065da-d504-4d31-800b-cfb6b82bb559\") " Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.891781 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtx6c\" (UniqueName: \"kubernetes.io/projected/3b154885-0b18-4926-b81c-c10208075c27-kube-api-access-rtx6c\") pod \"3b154885-0b18-4926-b81c-c10208075c27\" (UID: \"3b154885-0b18-4926-b81c-c10208075c27\") " Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.891892 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b154885-0b18-4926-b81c-c10208075c27-operator-scripts\") pod \"3b154885-0b18-4926-b81c-c10208075c27\" (UID: \"3b154885-0b18-4926-b81c-c10208075c27\") " Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.892186 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949065da-d504-4d31-800b-cfb6b82bb559-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "949065da-d504-4d31-800b-cfb6b82bb559" (UID: "949065da-d504-4d31-800b-cfb6b82bb559"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.892739 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b154885-0b18-4926-b81c-c10208075c27-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b154885-0b18-4926-b81c-c10208075c27" (UID: "3b154885-0b18-4926-b81c-c10208075c27"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.892761 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949065da-d504-4d31-800b-cfb6b82bb559-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.903456 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949065da-d504-4d31-800b-cfb6b82bb559-kube-api-access-8b67m" (OuterVolumeSpecName: "kube-api-access-8b67m") pod "949065da-d504-4d31-800b-cfb6b82bb559" (UID: "949065da-d504-4d31-800b-cfb6b82bb559"). InnerVolumeSpecName "kube-api-access-8b67m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.904196 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b154885-0b18-4926-b81c-c10208075c27-kube-api-access-rtx6c" (OuterVolumeSpecName: "kube-api-access-rtx6c") pod "3b154885-0b18-4926-b81c-c10208075c27" (UID: "3b154885-0b18-4926-b81c-c10208075c27"). InnerVolumeSpecName "kube-api-access-rtx6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.993670 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e111d88-8363-418a-aa16-e4738fc5dfa5-operator-scripts\") pod \"8e111d88-8363-418a-aa16-e4738fc5dfa5\" (UID: \"8e111d88-8363-418a-aa16-e4738fc5dfa5\") " Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.993914 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd7s5\" (UniqueName: \"kubernetes.io/projected/8e111d88-8363-418a-aa16-e4738fc5dfa5-kube-api-access-vd7s5\") pod \"8e111d88-8363-418a-aa16-e4738fc5dfa5\" (UID: \"8e111d88-8363-418a-aa16-e4738fc5dfa5\") " Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.994198 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e111d88-8363-418a-aa16-e4738fc5dfa5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e111d88-8363-418a-aa16-e4738fc5dfa5" (UID: "8e111d88-8363-418a-aa16-e4738fc5dfa5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.995346 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e111d88-8363-418a-aa16-e4738fc5dfa5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.995390 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b67m\" (UniqueName: \"kubernetes.io/projected/949065da-d504-4d31-800b-cfb6b82bb559-kube-api-access-8b67m\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.995406 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtx6c\" (UniqueName: \"kubernetes.io/projected/3b154885-0b18-4926-b81c-c10208075c27-kube-api-access-rtx6c\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.995417 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b154885-0b18-4926-b81c-c10208075c27-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:56 crc kubenswrapper[4580]: I0321 05:11:56.998205 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e111d88-8363-418a-aa16-e4738fc5dfa5-kube-api-access-vd7s5" (OuterVolumeSpecName: "kube-api-access-vd7s5") pod "8e111d88-8363-418a-aa16-e4738fc5dfa5" (UID: "8e111d88-8363-418a-aa16-e4738fc5dfa5"). InnerVolumeSpecName "kube-api-access-vd7s5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.097775 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd7s5\" (UniqueName: \"kubernetes.io/projected/8e111d88-8363-418a-aa16-e4738fc5dfa5-kube-api-access-vd7s5\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.195918 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-38bf-account-create-update-pwp74" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.195909 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-38bf-account-create-update-pwp74" event={"ID":"3b154885-0b18-4926-b81c-c10208075c27","Type":"ContainerDied","Data":"adb554f597721ec1d86a12646755cb046493586305cebd9ba9c001ca76daeef1"} Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.196068 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adb554f597721ec1d86a12646755cb046493586305cebd9ba9c001ca76daeef1" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.198267 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-smbzn" event={"ID":"8e111d88-8363-418a-aa16-e4738fc5dfa5","Type":"ContainerDied","Data":"77994f4cf2e302d28fe8990ab277ca402d83746ff3219b19a1362bc4c37c4820"} Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.198303 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77994f4cf2e302d28fe8990ab277ca402d83746ff3219b19a1362bc4c37c4820" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.198306 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-smbzn" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.199781 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-kkck5" event={"ID":"949065da-d504-4d31-800b-cfb6b82bb559","Type":"ContainerDied","Data":"79dfd67aa817563e9d9a314e85decce79422f23d8d2bd05209bb884881f4510f"} Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.199856 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-kkck5" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.199872 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79dfd67aa817563e9d9a314e85decce79422f23d8d2bd05209bb884881f4510f" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.574154 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-01e5-account-create-update-ptjxt" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.635181 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-26d4d" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.708873 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33-operator-scripts\") pod \"8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33\" (UID: \"8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33\") " Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.708966 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvr24\" (UniqueName: \"kubernetes.io/projected/1ec630d4-9254-4a93-b61f-d960ac2b3ccc-kube-api-access-rvr24\") pod \"1ec630d4-9254-4a93-b61f-d960ac2b3ccc\" (UID: \"1ec630d4-9254-4a93-b61f-d960ac2b3ccc\") " Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.709010 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv5g4\" (UniqueName: \"kubernetes.io/projected/8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33-kube-api-access-qv5g4\") pod \"8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33\" (UID: \"8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33\") " Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.709194 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec630d4-9254-4a93-b61f-d960ac2b3ccc-operator-scripts\") pod \"1ec630d4-9254-4a93-b61f-d960ac2b3ccc\" (UID: \"1ec630d4-9254-4a93-b61f-d960ac2b3ccc\") " Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.711802 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ec630d4-9254-4a93-b61f-d960ac2b3ccc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ec630d4-9254-4a93-b61f-d960ac2b3ccc" (UID: "1ec630d4-9254-4a93-b61f-d960ac2b3ccc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.712460 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33" (UID: "8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.716414 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33-kube-api-access-qv5g4" (OuterVolumeSpecName: "kube-api-access-qv5g4") pod "8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33" (UID: "8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33"). InnerVolumeSpecName "kube-api-access-qv5g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.717626 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec630d4-9254-4a93-b61f-d960ac2b3ccc-kube-api-access-rvr24" (OuterVolumeSpecName: "kube-api-access-rvr24") pod "1ec630d4-9254-4a93-b61f-d960ac2b3ccc" (UID: "1ec630d4-9254-4a93-b61f-d960ac2b3ccc"). InnerVolumeSpecName "kube-api-access-rvr24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.801767 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0055-account-create-update-jxb6l" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.811765 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.811833 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvr24\" (UniqueName: \"kubernetes.io/projected/1ec630d4-9254-4a93-b61f-d960ac2b3ccc-kube-api-access-rvr24\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.811851 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv5g4\" (UniqueName: \"kubernetes.io/projected/8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33-kube-api-access-qv5g4\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.811863 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec630d4-9254-4a93-b61f-d960ac2b3ccc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.912527 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486431e9-d00c-4a61-b831-619c73ef470f-operator-scripts\") pod \"486431e9-d00c-4a61-b831-619c73ef470f\" (UID: \"486431e9-d00c-4a61-b831-619c73ef470f\") " Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.912588 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s9v5\" (UniqueName: \"kubernetes.io/projected/486431e9-d00c-4a61-b831-619c73ef470f-kube-api-access-6s9v5\") pod \"486431e9-d00c-4a61-b831-619c73ef470f\" (UID: \"486431e9-d00c-4a61-b831-619c73ef470f\") " Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.913693 4580 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/486431e9-d00c-4a61-b831-619c73ef470f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "486431e9-d00c-4a61-b831-619c73ef470f" (UID: "486431e9-d00c-4a61-b831-619c73ef470f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:57 crc kubenswrapper[4580]: I0321 05:11:57.920202 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486431e9-d00c-4a61-b831-619c73ef470f-kube-api-access-6s9v5" (OuterVolumeSpecName: "kube-api-access-6s9v5") pod "486431e9-d00c-4a61-b831-619c73ef470f" (UID: "486431e9-d00c-4a61-b831-619c73ef470f"). InnerVolumeSpecName "kube-api-access-6s9v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:58 crc kubenswrapper[4580]: I0321 05:11:58.014384 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486431e9-d00c-4a61-b831-619c73ef470f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:58 crc kubenswrapper[4580]: I0321 05:11:58.014423 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s9v5\" (UniqueName: \"kubernetes.io/projected/486431e9-d00c-4a61-b831-619c73ef470f-kube-api-access-6s9v5\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:58 crc kubenswrapper[4580]: I0321 05:11:58.210867 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-01e5-account-create-update-ptjxt" Mar 21 05:11:58 crc kubenswrapper[4580]: I0321 05:11:58.210856 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-01e5-account-create-update-ptjxt" event={"ID":"8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33","Type":"ContainerDied","Data":"88578d242de7b4126d61f50a1470ab17ecb3114c7bf4d54814dc2375bcafd55b"} Mar 21 05:11:58 crc kubenswrapper[4580]: I0321 05:11:58.210996 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88578d242de7b4126d61f50a1470ab17ecb3114c7bf4d54814dc2375bcafd55b" Mar 21 05:11:58 crc kubenswrapper[4580]: I0321 05:11:58.214388 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0055-account-create-update-jxb6l" event={"ID":"486431e9-d00c-4a61-b831-619c73ef470f","Type":"ContainerDied","Data":"83bf6a5bbf6774f91c173c10dbbff0db0a8d8730332d45c6620eca18a6a0ac23"} Mar 21 05:11:58 crc kubenswrapper[4580]: I0321 05:11:58.214422 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0055-account-create-update-jxb6l" Mar 21 05:11:58 crc kubenswrapper[4580]: I0321 05:11:58.214426 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83bf6a5bbf6774f91c173c10dbbff0db0a8d8730332d45c6620eca18a6a0ac23" Mar 21 05:11:58 crc kubenswrapper[4580]: I0321 05:11:58.215698 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-26d4d" event={"ID":"1ec630d4-9254-4a93-b61f-d960ac2b3ccc","Type":"ContainerDied","Data":"6a326addd194b4446355eab17809560b50c4d8ea2184b1ec9f3c151e1bb9328b"} Mar 21 05:11:58 crc kubenswrapper[4580]: I0321 05:11:58.215723 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a326addd194b4446355eab17809560b50c4d8ea2184b1ec9f3c151e1bb9328b" Mar 21 05:11:58 crc kubenswrapper[4580]: I0321 05:11:58.215772 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-26d4d" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.139701 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567832-4mr87"] Mar 21 05:12:00 crc kubenswrapper[4580]: E0321 05:12:00.142362 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b154885-0b18-4926-b81c-c10208075c27" containerName="mariadb-account-create-update" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.142392 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b154885-0b18-4926-b81c-c10208075c27" containerName="mariadb-account-create-update" Mar 21 05:12:00 crc kubenswrapper[4580]: E0321 05:12:00.142412 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e111d88-8363-418a-aa16-e4738fc5dfa5" containerName="mariadb-database-create" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.142421 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e111d88-8363-418a-aa16-e4738fc5dfa5" 
containerName="mariadb-database-create" Mar 21 05:12:00 crc kubenswrapper[4580]: E0321 05:12:00.142437 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486431e9-d00c-4a61-b831-619c73ef470f" containerName="mariadb-account-create-update" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.142445 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="486431e9-d00c-4a61-b831-619c73ef470f" containerName="mariadb-account-create-update" Mar 21 05:12:00 crc kubenswrapper[4580]: E0321 05:12:00.142458 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33" containerName="mariadb-account-create-update" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.142466 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33" containerName="mariadb-account-create-update" Mar 21 05:12:00 crc kubenswrapper[4580]: E0321 05:12:00.142481 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949065da-d504-4d31-800b-cfb6b82bb559" containerName="mariadb-database-create" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.142488 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="949065da-d504-4d31-800b-cfb6b82bb559" containerName="mariadb-database-create" Mar 21 05:12:00 crc kubenswrapper[4580]: E0321 05:12:00.142497 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec630d4-9254-4a93-b61f-d960ac2b3ccc" containerName="mariadb-database-create" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.142505 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec630d4-9254-4a93-b61f-d960ac2b3ccc" containerName="mariadb-database-create" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.142692 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e111d88-8363-418a-aa16-e4738fc5dfa5" containerName="mariadb-database-create" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.142707 4580 
memory_manager.go:354] "RemoveStaleState removing state" podUID="486431e9-d00c-4a61-b831-619c73ef470f" containerName="mariadb-account-create-update" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.142721 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec630d4-9254-4a93-b61f-d960ac2b3ccc" containerName="mariadb-database-create" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.142733 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33" containerName="mariadb-account-create-update" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.142745 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="949065da-d504-4d31-800b-cfb6b82bb559" containerName="mariadb-database-create" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.142755 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b154885-0b18-4926-b81c-c10208075c27" containerName="mariadb-account-create-update" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.143424 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-4mr87" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.146655 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.146892 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.147011 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.151713 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-4mr87"] Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.279283 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w46x\" (UniqueName: \"kubernetes.io/projected/567de937-7644-4d4f-a759-f74e483dfe3d-kube-api-access-7w46x\") pod \"auto-csr-approver-29567832-4mr87\" (UID: \"567de937-7644-4d4f-a759-f74e483dfe3d\") " pod="openshift-infra/auto-csr-approver-29567832-4mr87" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.381579 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w46x\" (UniqueName: \"kubernetes.io/projected/567de937-7644-4d4f-a759-f74e483dfe3d-kube-api-access-7w46x\") pod \"auto-csr-approver-29567832-4mr87\" (UID: \"567de937-7644-4d4f-a759-f74e483dfe3d\") " pod="openshift-infra/auto-csr-approver-29567832-4mr87" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.406920 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w46x\" (UniqueName: \"kubernetes.io/projected/567de937-7644-4d4f-a759-f74e483dfe3d-kube-api-access-7w46x\") pod \"auto-csr-approver-29567832-4mr87\" (UID: \"567de937-7644-4d4f-a759-f74e483dfe3d\") " 
pod="openshift-infra/auto-csr-approver-29567832-4mr87" Mar 21 05:12:00 crc kubenswrapper[4580]: I0321 05:12:00.478505 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-4mr87" Mar 21 05:12:01 crc kubenswrapper[4580]: I0321 05:12:01.096617 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:12:01 crc kubenswrapper[4580]: I0321 05:12:01.103167 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d59ab798-9ae9-4f47-b58b-36417592eef2-etc-swift\") pod \"swift-storage-0\" (UID: \"d59ab798-9ae9-4f47-b58b-36417592eef2\") " pod="openstack/swift-storage-0" Mar 21 05:12:01 crc kubenswrapper[4580]: I0321 05:12:01.234569 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 21 05:12:11 crc kubenswrapper[4580]: E0321 05:12:11.762597 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Mar 21 05:12:11 crc kubenswrapper[4580]: E0321 05:12:11.763233 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h95zr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxO
ptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-n4vzq_openstack(f86c26eb-bb22-460c-8a45-191a02924112): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:12:11 crc kubenswrapper[4580]: E0321 05:12:11.765220 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-n4vzq" podUID="f86c26eb-bb22-460c-8a45-191a02924112" Mar 21 05:12:12 crc kubenswrapper[4580]: I0321 05:12:12.349262 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-4mr87"] Mar 21 05:12:12 crc kubenswrapper[4580]: I0321 05:12:12.353668 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zxh5l" event={"ID":"0d29c081-c37a-46ac-8354-178685882ce2","Type":"ContainerStarted","Data":"2feacbdc69c1b37fcb3920f7fd6f4520a27eed9b90712074b5be2625073e72b4"} Mar 21 05:12:12 crc kubenswrapper[4580]: E0321 05:12:12.355494 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-n4vzq" podUID="f86c26eb-bb22-460c-8a45-191a02924112" Mar 21 05:12:12 crc kubenswrapper[4580]: I0321 05:12:12.369028 4580 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 21 05:12:12 crc kubenswrapper[4580]: I0321 05:12:12.401973 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-zxh5l" podStartSLOduration=3.461159975 podStartE2EDuration="20.401953998s" podCreationTimestamp="2026-03-21 05:11:52 +0000 UTC" firstStartedPulling="2026-03-21 05:11:54.849522212 +0000 UTC m=+1219.932105840" lastFinishedPulling="2026-03-21 05:12:11.790316235 +0000 UTC m=+1236.872899863" observedRunningTime="2026-03-21 05:12:12.397632743 +0000 UTC m=+1237.480216381" watchObservedRunningTime="2026-03-21 05:12:12.401953998 +0000 UTC m=+1237.484537636" Mar 21 05:12:12 crc kubenswrapper[4580]: I0321 05:12:12.423626 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 21 05:12:13 crc kubenswrapper[4580]: I0321 05:12:13.367334 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567832-4mr87" event={"ID":"567de937-7644-4d4f-a759-f74e483dfe3d","Type":"ContainerStarted","Data":"56770ffdee56a6be69b575f8110cdb217e579d44d0458b0059f898ee3c50e4b8"} Mar 21 05:12:13 crc kubenswrapper[4580]: I0321 05:12:13.369923 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"c5945b8c47b86ddbd273febcf9365372457f211c3811fc8e9b0429cdaa292947"} Mar 21 05:12:14 crc kubenswrapper[4580]: I0321 05:12:14.378215 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567832-4mr87" event={"ID":"567de937-7644-4d4f-a759-f74e483dfe3d","Type":"ContainerStarted","Data":"5cff183c9eab864165e01745bb922f30eb4c972ab49659f258461f0209a9ced5"} Mar 21 05:12:14 crc kubenswrapper[4580]: I0321 05:12:14.386027 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"73917c3bf05369243a37ad7619071badf2cb3cbbdba7bd737cfeb69f0350ab51"} Mar 21 05:12:14 crc kubenswrapper[4580]: I0321 05:12:14.386068 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"6bf278265c71dc61fdc8f182613c8af8cb320465e930bedf68401b3a06b38e2f"} Mar 21 05:12:14 crc kubenswrapper[4580]: I0321 05:12:14.386080 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"f02e5fbf4944b5abb7c69f58129355a3dd3f1b38a36000a238fc4e1ef3b53e26"} Mar 21 05:12:14 crc kubenswrapper[4580]: I0321 05:12:14.404325 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567832-4mr87" podStartSLOduration=13.059526643 podStartE2EDuration="14.40430329s" podCreationTimestamp="2026-03-21 05:12:00 +0000 UTC" firstStartedPulling="2026-03-21 05:12:12.368823077 +0000 UTC m=+1237.451406705" lastFinishedPulling="2026-03-21 05:12:13.713599724 +0000 UTC m=+1238.796183352" observedRunningTime="2026-03-21 05:12:14.397447677 +0000 UTC m=+1239.480031305" watchObservedRunningTime="2026-03-21 05:12:14.40430329 +0000 UTC m=+1239.486886918" Mar 21 05:12:15 crc kubenswrapper[4580]: I0321 05:12:15.395845 4580 generic.go:334] "Generic (PLEG): container finished" podID="567de937-7644-4d4f-a759-f74e483dfe3d" containerID="5cff183c9eab864165e01745bb922f30eb4c972ab49659f258461f0209a9ced5" exitCode=0 Mar 21 05:12:15 crc kubenswrapper[4580]: I0321 05:12:15.396174 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567832-4mr87" event={"ID":"567de937-7644-4d4f-a759-f74e483dfe3d","Type":"ContainerDied","Data":"5cff183c9eab864165e01745bb922f30eb4c972ab49659f258461f0209a9ced5"} Mar 21 05:12:15 crc 
kubenswrapper[4580]: I0321 05:12:15.399277 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"b5a8cdd32a30ba40d4cae614ff0dfdfd3ac3a5c89cd3a3c17c78900259ebf80c"} Mar 21 05:12:16 crc kubenswrapper[4580]: I0321 05:12:16.414056 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"66402f2f377179bd2982886cc30ee53a9e860f205c0dd89ee80520c13a64949d"} Mar 21 05:12:16 crc kubenswrapper[4580]: I0321 05:12:16.665223 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-4mr87" Mar 21 05:12:16 crc kubenswrapper[4580]: I0321 05:12:16.707047 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w46x\" (UniqueName: \"kubernetes.io/projected/567de937-7644-4d4f-a759-f74e483dfe3d-kube-api-access-7w46x\") pod \"567de937-7644-4d4f-a759-f74e483dfe3d\" (UID: \"567de937-7644-4d4f-a759-f74e483dfe3d\") " Mar 21 05:12:16 crc kubenswrapper[4580]: I0321 05:12:16.712028 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/567de937-7644-4d4f-a759-f74e483dfe3d-kube-api-access-7w46x" (OuterVolumeSpecName: "kube-api-access-7w46x") pod "567de937-7644-4d4f-a759-f74e483dfe3d" (UID: "567de937-7644-4d4f-a759-f74e483dfe3d"). InnerVolumeSpecName "kube-api-access-7w46x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:16 crc kubenswrapper[4580]: I0321 05:12:16.809014 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w46x\" (UniqueName: \"kubernetes.io/projected/567de937-7644-4d4f-a759-f74e483dfe3d-kube-api-access-7w46x\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:17 crc kubenswrapper[4580]: I0321 05:12:17.452996 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"4e1e0888b6f2da024ec0e56cc2a3d9816c84eb4a6476bfead699fd7f4295237a"} Mar 21 05:12:17 crc kubenswrapper[4580]: I0321 05:12:17.453039 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"87eaaceeb9faa6e453144872edd6b48bca5291f3cc9d033af88879444525c0f3"} Mar 21 05:12:17 crc kubenswrapper[4580]: I0321 05:12:17.453049 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"0b07066fd7dde4143382baca79ebf9dbe72aa7accfc3f1f4c1035d25424be58e"} Mar 21 05:12:17 crc kubenswrapper[4580]: I0321 05:12:17.455471 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567832-4mr87" event={"ID":"567de937-7644-4d4f-a759-f74e483dfe3d","Type":"ContainerDied","Data":"56770ffdee56a6be69b575f8110cdb217e579d44d0458b0059f898ee3c50e4b8"} Mar 21 05:12:17 crc kubenswrapper[4580]: I0321 05:12:17.455498 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56770ffdee56a6be69b575f8110cdb217e579d44d0458b0059f898ee3c50e4b8" Mar 21 05:12:17 crc kubenswrapper[4580]: I0321 05:12:17.460936 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-4mr87" Mar 21 05:12:17 crc kubenswrapper[4580]: I0321 05:12:17.470484 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-5svml"] Mar 21 05:12:17 crc kubenswrapper[4580]: I0321 05:12:17.481997 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-5svml"] Mar 21 05:12:17 crc kubenswrapper[4580]: I0321 05:12:17.632295 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53dd2f4a-1fb1-43a5-b164-8de12304064a" path="/var/lib/kubelet/pods/53dd2f4a-1fb1-43a5-b164-8de12304064a/volumes" Mar 21 05:12:18 crc kubenswrapper[4580]: I0321 05:12:18.469036 4580 generic.go:334] "Generic (PLEG): container finished" podID="0d29c081-c37a-46ac-8354-178685882ce2" containerID="2feacbdc69c1b37fcb3920f7fd6f4520a27eed9b90712074b5be2625073e72b4" exitCode=0 Mar 21 05:12:18 crc kubenswrapper[4580]: I0321 05:12:18.469179 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zxh5l" event={"ID":"0d29c081-c37a-46ac-8354-178685882ce2","Type":"ContainerDied","Data":"2feacbdc69c1b37fcb3920f7fd6f4520a27eed9b90712074b5be2625073e72b4"} Mar 21 05:12:18 crc kubenswrapper[4580]: I0321 05:12:18.479516 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"7ee245e490ed85bc7d623513618b3dbb7691f7725b25adfd0a77fe7bc4fd2980"} Mar 21 05:12:18 crc kubenswrapper[4580]: I0321 05:12:18.479564 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"e09bc7740908a9728155ef08067faaa90b94243497476117814f3581f18d2710"} Mar 21 05:12:18 crc kubenswrapper[4580]: I0321 05:12:18.479577 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"5ebfdc24512b8e5d0216c2d81f22c133158fe5ee9249e687e12b7c79e23cf161"} Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.502896 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"566a45720a515be8ae7139a081b2f759a7bc9b5de3f46aee83791ca1d6a6f562"} Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.503446 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"4234408d381c4a80aa5422f2eadb19499373473edd82379bb6b916f444e5874b"} Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.503470 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"d4799889cb05dacc123c8c6527842a8a043827b54bd6def599f6ed819f815a53"} Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.503485 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d59ab798-9ae9-4f47-b58b-36417592eef2","Type":"ContainerStarted","Data":"16a11ec06bb8bfa686343b89510783339a558a8eb6ff199048cedb70a19b7974"} Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.545368 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=46.154791935 podStartE2EDuration="51.545350207s" podCreationTimestamp="2026-03-21 05:11:28 +0000 UTC" firstStartedPulling="2026-03-21 05:12:12.439415074 +0000 UTC m=+1237.521998702" lastFinishedPulling="2026-03-21 05:12:17.829973346 +0000 UTC m=+1242.912556974" observedRunningTime="2026-03-21 05:12:19.541643848 +0000 UTC m=+1244.624227476" watchObservedRunningTime="2026-03-21 05:12:19.545350207 +0000 UTC m=+1244.627933835" Mar 21 05:12:19 crc 
kubenswrapper[4580]: I0321 05:12:19.815284 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zxh5l" Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.906854 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v4rvs"] Mar 21 05:12:19 crc kubenswrapper[4580]: E0321 05:12:19.907282 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d29c081-c37a-46ac-8354-178685882ce2" containerName="keystone-db-sync" Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.907307 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d29c081-c37a-46ac-8354-178685882ce2" containerName="keystone-db-sync" Mar 21 05:12:19 crc kubenswrapper[4580]: E0321 05:12:19.907335 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="567de937-7644-4d4f-a759-f74e483dfe3d" containerName="oc" Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.907343 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="567de937-7644-4d4f-a759-f74e483dfe3d" containerName="oc" Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.907526 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="567de937-7644-4d4f-a759-f74e483dfe3d" containerName="oc" Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.907550 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d29c081-c37a-46ac-8354-178685882ce2" containerName="keystone-db-sync" Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.908813 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.912898 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.919959 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v4rvs"] Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.966678 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d29c081-c37a-46ac-8354-178685882ce2-combined-ca-bundle\") pod \"0d29c081-c37a-46ac-8354-178685882ce2\" (UID: \"0d29c081-c37a-46ac-8354-178685882ce2\") " Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.966766 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d29c081-c37a-46ac-8354-178685882ce2-config-data\") pod \"0d29c081-c37a-46ac-8354-178685882ce2\" (UID: \"0d29c081-c37a-46ac-8354-178685882ce2\") " Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.966873 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79nmt\" (UniqueName: \"kubernetes.io/projected/0d29c081-c37a-46ac-8354-178685882ce2-kube-api-access-79nmt\") pod \"0d29c081-c37a-46ac-8354-178685882ce2\" (UID: \"0d29c081-c37a-46ac-8354-178685882ce2\") " Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.984560 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d29c081-c37a-46ac-8354-178685882ce2-kube-api-access-79nmt" (OuterVolumeSpecName: "kube-api-access-79nmt") pod "0d29c081-c37a-46ac-8354-178685882ce2" (UID: "0d29c081-c37a-46ac-8354-178685882ce2"). InnerVolumeSpecName "kube-api-access-79nmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:19 crc kubenswrapper[4580]: I0321 05:12:19.994604 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d29c081-c37a-46ac-8354-178685882ce2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d29c081-c37a-46ac-8354-178685882ce2" (UID: "0d29c081-c37a-46ac-8354-178685882ce2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.040263 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d29c081-c37a-46ac-8354-178685882ce2-config-data" (OuterVolumeSpecName: "config-data") pod "0d29c081-c37a-46ac-8354-178685882ce2" (UID: "0d29c081-c37a-46ac-8354-178685882ce2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.069079 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-config\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.069347 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-dns-svc\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.069376 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9qxn\" (UniqueName: 
\"kubernetes.io/projected/83aac843-d8d4-4ad6-ba6a-6139d0197002-kube-api-access-b9qxn\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.069415 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.069442 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.069572 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.069762 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d29c081-c37a-46ac-8354-178685882ce2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.069855 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d29c081-c37a-46ac-8354-178685882ce2-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 
05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.069872 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79nmt\" (UniqueName: \"kubernetes.io/projected/0d29c081-c37a-46ac-8354-178685882ce2-kube-api-access-79nmt\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.171032 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-config\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.171285 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-dns-svc\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.171374 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9qxn\" (UniqueName: \"kubernetes.io/projected/83aac843-d8d4-4ad6-ba6a-6139d0197002-kube-api-access-b9qxn\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.171449 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.171520 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.171597 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.172432 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.172666 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-config\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.173192 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.173432 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-dns-swift-storage-0\") pod 
\"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.173948 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-dns-svc\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.187458 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9qxn\" (UniqueName: \"kubernetes.io/projected/83aac843-d8d4-4ad6-ba6a-6139d0197002-kube-api-access-b9qxn\") pod \"dnsmasq-dns-764c5664d7-v4rvs\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.227189 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.516889 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zxh5l" event={"ID":"0d29c081-c37a-46ac-8354-178685882ce2","Type":"ContainerDied","Data":"1b11403c957c270585d411e8f101df1fc9966dadd2b2443b03ab8544f523463f"} Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.516959 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b11403c957c270585d411e8f101df1fc9966dadd2b2443b03ab8544f523463f" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.516925 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zxh5l" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.708544 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v4rvs"] Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.794635 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v4rvs"] Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.853689 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rrz6s"] Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.854946 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.860248 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.860424 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sqx2b" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.860537 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.860659 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.866732 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rrz6s"] Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.871450 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.902735 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-f4kld"] Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.904047 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.917661 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-f4kld"] Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.986756 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-fernet-keys\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.986875 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwlg6\" (UniqueName: \"kubernetes.io/projected/27996094-c657-451a-98fe-b960c1b88d31-kube-api-access-gwlg6\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.986921 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-scripts\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.986961 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-config-data\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.986977 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-credential-keys\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:20 crc kubenswrapper[4580]: I0321 05:12:20.987001 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-combined-ca-bundle\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.090865 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwlg6\" (UniqueName: \"kubernetes.io/projected/27996094-c657-451a-98fe-b960c1b88d31-kube-api-access-gwlg6\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.091140 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-scripts\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.091180 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-dns-svc\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.091206 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-config\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.091225 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.091246 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-config-data\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.091276 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-credential-keys\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.091295 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.091312 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-combined-ca-bundle\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.091331 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.091366 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l75cl\" (UniqueName: \"kubernetes.io/projected/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-kube-api-access-l75cl\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.091409 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-fernet-keys\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.101615 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-credential-keys\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.102446 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-combined-ca-bundle\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.103125 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-config-data\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.106348 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-scripts\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.107481 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-fernet-keys\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.126530 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-m7rwg"] Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.134714 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.145649 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-npsxx" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.146149 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.146357 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.154637 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwlg6\" (UniqueName: \"kubernetes.io/projected/27996094-c657-451a-98fe-b960c1b88d31-kube-api-access-gwlg6\") pod \"keystone-bootstrap-rrz6s\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.193319 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l75cl\" (UniqueName: \"kubernetes.io/projected/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-kube-api-access-l75cl\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.193452 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-dns-svc\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.193481 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-config\") pod 
\"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.193499 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.193522 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.193542 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.194337 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.195186 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-dns-svc\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: 
\"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.206042 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.206631 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.209112 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-config\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.211126 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.211205 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57f4c4598f-qtdw2"] Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.212560 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.216002 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.216219 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.226179 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.226514 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-gllwv" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.242119 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-m7rwg"] Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.298675 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-config-data\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.298745 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-scripts\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.298880 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6q57\" (UniqueName: \"kubernetes.io/projected/55568564-a701-4c16-b5c4-617f88c364a5-kube-api-access-j6q57\") pod \"cinder-db-sync-m7rwg\" (UID: 
\"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.298916 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-combined-ca-bundle\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.298995 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55568564-a701-4c16-b5c4-617f88c364a5-etc-machine-id\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.299050 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-db-sync-config-data\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.334038 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57f4c4598f-qtdw2"] Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.337379 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l75cl\" (UniqueName: \"kubernetes.io/projected/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-kube-api-access-l75cl\") pod \"dnsmasq-dns-5959f8865f-f4kld\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.352868 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-tb6cr"] Mar 21 05:12:21 crc 
kubenswrapper[4580]: I0321 05:12:21.354442 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tb6cr" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.370216 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.370554 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2ctzv" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.370849 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.387224 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.389063 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.400602 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-combined-ca-bundle\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.400860 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55568564-a701-4c16-b5c4-617f88c364a5-etc-machine-id\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.401972 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55568564-a701-4c16-b5c4-617f88c364a5-etc-machine-id\") pod \"cinder-db-sync-m7rwg\" 
(UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.405254 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.406590 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-combined-ca-bundle\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.406592 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aec8eff5-d865-4c47-acda-51609f6df4b6-logs\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.407008 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-db-sync-config-data\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.409850 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aec8eff5-d865-4c47-acda-51609f6df4b6-config-data\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.409884 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j25d2\" (UniqueName: 
\"kubernetes.io/projected/aec8eff5-d865-4c47-acda-51609f6df4b6-kube-api-access-j25d2\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.409916 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-config-data\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.409943 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aec8eff5-d865-4c47-acda-51609f6df4b6-horizon-secret-key\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.409964 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-scripts\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.409982 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aec8eff5-d865-4c47-acda-51609f6df4b6-scripts\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.410086 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6q57\" (UniqueName: \"kubernetes.io/projected/55568564-a701-4c16-b5c4-617f88c364a5-kube-api-access-j6q57\") pod 
\"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.417941 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-config-data\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.422133 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-scripts\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.435903 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-tb6cr"] Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.436561 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-db-sync-config-data\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.448926 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.465749 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.486428 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6q57\" (UniqueName: \"kubernetes.io/projected/55568564-a701-4c16-b5c4-617f88c364a5-kube-api-access-j6q57\") pod \"cinder-db-sync-m7rwg\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " 
pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.512657 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1916415-d4eb-4dbd-bccb-ac932a09843c-config\") pod \"neutron-db-sync-tb6cr\" (UID: \"b1916415-d4eb-4dbd-bccb-ac932a09843c\") " pod="openstack/neutron-db-sync-tb6cr" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.512802 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk4vn\" (UniqueName: \"kubernetes.io/projected/b1916415-d4eb-4dbd-bccb-ac932a09843c-kube-api-access-dk4vn\") pod \"neutron-db-sync-tb6cr\" (UID: \"b1916415-d4eb-4dbd-bccb-ac932a09843c\") " pod="openstack/neutron-db-sync-tb6cr" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.512838 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-scripts\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.512886 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5v52\" (UniqueName: \"kubernetes.io/projected/c311e091-7cf1-426b-9788-a3d64b198e43-kube-api-access-m5v52\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.513324 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c311e091-7cf1-426b-9788-a3d64b198e43-run-httpd\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.513363 
4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aec8eff5-d865-4c47-acda-51609f6df4b6-logs\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.513420 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.513464 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c311e091-7cf1-426b-9788-a3d64b198e43-log-httpd\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.513492 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aec8eff5-d865-4c47-acda-51609f6df4b6-config-data\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.513511 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j25d2\" (UniqueName: \"kubernetes.io/projected/aec8eff5-d865-4c47-acda-51609f6df4b6-kube-api-access-j25d2\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.513537 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.513569 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aec8eff5-d865-4c47-acda-51609f6df4b6-horizon-secret-key\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.513594 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aec8eff5-d865-4c47-acda-51609f6df4b6-scripts\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.513618 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-config-data\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.513643 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1916415-d4eb-4dbd-bccb-ac932a09843c-combined-ca-bundle\") pod \"neutron-db-sync-tb6cr\" (UID: \"b1916415-d4eb-4dbd-bccb-ac932a09843c\") " pod="openstack/neutron-db-sync-tb6cr" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.514132 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aec8eff5-d865-4c47-acda-51609f6df4b6-logs\") pod \"horizon-57f4c4598f-qtdw2\" (UID: 
\"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.515218 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aec8eff5-d865-4c47-acda-51609f6df4b6-config-data\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.517248 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aec8eff5-d865-4c47-acda-51609f6df4b6-scripts\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.521645 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aec8eff5-d865-4c47-acda-51609f6df4b6-horizon-secret-key\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.554260 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.556748 4580 generic.go:334] "Generic (PLEG): container finished" podID="83aac843-d8d4-4ad6-ba6a-6139d0197002" containerID="afaded7a1d4c13b1795a3867e7f5743d3f099d25044f7f2e32754f806f001585" exitCode=0 Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.556958 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" event={"ID":"83aac843-d8d4-4ad6-ba6a-6139d0197002","Type":"ContainerDied","Data":"afaded7a1d4c13b1795a3867e7f5743d3f099d25044f7f2e32754f806f001585"} Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.557046 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" event={"ID":"83aac843-d8d4-4ad6-ba6a-6139d0197002","Type":"ContainerStarted","Data":"e9661e9ddc45c379f47cd0088432f34c0fbd766d0d78e0772a9d47bcad8b208e"} Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.557494 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-cgz9l"] Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.559116 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.571995 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.619867 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.620413 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6mvvp" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.620555 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.620117 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.620699 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-combined-ca-bundle\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.620744 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-config-data\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.620770 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2234c053-0318-4d03-8e9f-b9ee569529fc-logs\") pod 
\"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.620848 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-config-data\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.620875 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1916415-d4eb-4dbd-bccb-ac932a09843c-combined-ca-bundle\") pod \"neutron-db-sync-tb6cr\" (UID: \"b1916415-d4eb-4dbd-bccb-ac932a09843c\") " pod="openstack/neutron-db-sync-tb6cr" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.620944 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d6h4\" (UniqueName: \"kubernetes.io/projected/2234c053-0318-4d03-8e9f-b9ee569529fc-kube-api-access-5d6h4\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.620979 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1916415-d4eb-4dbd-bccb-ac932a09843c-config\") pod \"neutron-db-sync-tb6cr\" (UID: \"b1916415-d4eb-4dbd-bccb-ac932a09843c\") " pod="openstack/neutron-db-sync-tb6cr" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.621001 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk4vn\" (UniqueName: \"kubernetes.io/projected/b1916415-d4eb-4dbd-bccb-ac932a09843c-kube-api-access-dk4vn\") pod \"neutron-db-sync-tb6cr\" (UID: \"b1916415-d4eb-4dbd-bccb-ac932a09843c\") " 
pod="openstack/neutron-db-sync-tb6cr" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.621024 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-scripts\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.621055 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5v52\" (UniqueName: \"kubernetes.io/projected/c311e091-7cf1-426b-9788-a3d64b198e43-kube-api-access-m5v52\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.621106 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-scripts\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.621162 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c311e091-7cf1-426b-9788-a3d64b198e43-run-httpd\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.621197 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.621253 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/c311e091-7cf1-426b-9788-a3d64b198e43-log-httpd\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.621730 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c311e091-7cf1-426b-9788-a3d64b198e43-log-httpd\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.623956 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c311e091-7cf1-426b-9788-a3d64b198e43-run-httpd\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.676673 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-scripts\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.726111 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j25d2\" (UniqueName: \"kubernetes.io/projected/aec8eff5-d865-4c47-acda-51609f6df4b6-kube-api-access-j25d2\") pod \"horizon-57f4c4598f-qtdw2\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.726521 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 
05:12:21.727206 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.728250 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-config-data\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.744242 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1916415-d4eb-4dbd-bccb-ac932a09843c-combined-ca-bundle\") pod \"neutron-db-sync-tb6cr\" (UID: \"b1916415-d4eb-4dbd-bccb-ac932a09843c\") " pod="openstack/neutron-db-sync-tb6cr" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.750409 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-combined-ca-bundle\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.779208 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-config-data\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.784130 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2234c053-0318-4d03-8e9f-b9ee569529fc-logs\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.784477 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d6h4\" (UniqueName: \"kubernetes.io/projected/2234c053-0318-4d03-8e9f-b9ee569529fc-kube-api-access-5d6h4\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.784838 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-scripts\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.789596 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2234c053-0318-4d03-8e9f-b9ee569529fc-logs\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.800459 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk4vn\" (UniqueName: \"kubernetes.io/projected/b1916415-d4eb-4dbd-bccb-ac932a09843c-kube-api-access-dk4vn\") pod \"neutron-db-sync-tb6cr\" (UID: \"b1916415-d4eb-4dbd-bccb-ac932a09843c\") " pod="openstack/neutron-db-sync-tb6cr" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.783346 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cgz9l"] Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.860242 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-f2ksl"] Mar 
21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.831596 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-combined-ca-bundle\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.853049 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-scripts\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.860089 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1916415-d4eb-4dbd-bccb-ac932a09843c-config\") pod \"neutron-db-sync-tb6cr\" (UID: \"b1916415-d4eb-4dbd-bccb-ac932a09843c\") " pod="openstack/neutron-db-sync-tb6cr" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.834520 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-config-data\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.862141 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-f2ksl" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.866585 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5v52\" (UniqueName: \"kubernetes.io/projected/c311e091-7cf1-426b-9788-a3d64b198e43-kube-api-access-m5v52\") pod \"ceilometer-0\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " pod="openstack/ceilometer-0" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.875656 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tlsfx" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.876064 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.934660 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d6h4\" (UniqueName: \"kubernetes.io/projected/2234c053-0318-4d03-8e9f-b9ee569529fc-kube-api-access-5d6h4\") pod \"placement-db-sync-cgz9l\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.988072 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b5d99966f-s6s2j"] Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.990241 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.993313 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3bfbab08-aee7-43bf-9118-252682438c95-db-sync-config-data\") pod \"barbican-db-sync-f2ksl\" (UID: \"3bfbab08-aee7-43bf-9118-252682438c95\") " pod="openstack/barbican-db-sync-f2ksl" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.994287 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfbab08-aee7-43bf-9118-252682438c95-combined-ca-bundle\") pod \"barbican-db-sync-f2ksl\" (UID: \"3bfbab08-aee7-43bf-9118-252682438c95\") " pod="openstack/barbican-db-sync-f2ksl" Mar 21 05:12:21 crc kubenswrapper[4580]: I0321 05:12:21.994426 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9cdd\" (UniqueName: \"kubernetes.io/projected/3bfbab08-aee7-43bf-9118-252682438c95-kube-api-access-d9cdd\") pod \"barbican-db-sync-f2ksl\" (UID: \"3bfbab08-aee7-43bf-9118-252682438c95\") " pod="openstack/barbican-db-sync-f2ksl" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.010266 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.036071 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-f4kld"] Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.037755 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-tb6cr" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.053872 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f2ksl"] Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.064819 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.070326 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b5d99966f-s6s2j"] Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.097056 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z94fz\" (UniqueName: \"kubernetes.io/projected/e829eb3c-14a4-40d6-904e-483dbe3cb066-kube-api-access-z94fz\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.097107 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e829eb3c-14a4-40d6-904e-483dbe3cb066-config-data\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.097172 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e829eb3c-14a4-40d6-904e-483dbe3cb066-scripts\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.097217 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/e829eb3c-14a4-40d6-904e-483dbe3cb066-horizon-secret-key\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.097278 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3bfbab08-aee7-43bf-9118-252682438c95-db-sync-config-data\") pod \"barbican-db-sync-f2ksl\" (UID: \"3bfbab08-aee7-43bf-9118-252682438c95\") " pod="openstack/barbican-db-sync-f2ksl" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.098552 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfbab08-aee7-43bf-9118-252682438c95-combined-ca-bundle\") pod \"barbican-db-sync-f2ksl\" (UID: \"3bfbab08-aee7-43bf-9118-252682438c95\") " pod="openstack/barbican-db-sync-f2ksl" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.098586 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e829eb3c-14a4-40d6-904e-483dbe3cb066-logs\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.098627 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9cdd\" (UniqueName: \"kubernetes.io/projected/3bfbab08-aee7-43bf-9118-252682438c95-kube-api-access-d9cdd\") pod \"barbican-db-sync-f2ksl\" (UID: \"3bfbab08-aee7-43bf-9118-252682438c95\") " pod="openstack/barbican-db-sync-f2ksl" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.120853 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfbab08-aee7-43bf-9118-252682438c95-combined-ca-bundle\") 
pod \"barbican-db-sync-f2ksl\" (UID: \"3bfbab08-aee7-43bf-9118-252682438c95\") " pod="openstack/barbican-db-sync-f2ksl" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.150372 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3bfbab08-aee7-43bf-9118-252682438c95-db-sync-config-data\") pod \"barbican-db-sync-f2ksl\" (UID: \"3bfbab08-aee7-43bf-9118-252682438c95\") " pod="openstack/barbican-db-sync-f2ksl" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.153375 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9cdd\" (UniqueName: \"kubernetes.io/projected/3bfbab08-aee7-43bf-9118-252682438c95-kube-api-access-d9cdd\") pod \"barbican-db-sync-f2ksl\" (UID: \"3bfbab08-aee7-43bf-9118-252682438c95\") " pod="openstack/barbican-db-sync-f2ksl" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.174887 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-5tp6t"] Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.197448 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.200283 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e829eb3c-14a4-40d6-904e-483dbe3cb066-scripts\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.200363 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e829eb3c-14a4-40d6-904e-483dbe3cb066-horizon-secret-key\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.200411 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e829eb3c-14a4-40d6-904e-483dbe3cb066-logs\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.200490 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z94fz\" (UniqueName: \"kubernetes.io/projected/e829eb3c-14a4-40d6-904e-483dbe3cb066-kube-api-access-z94fz\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.200526 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e829eb3c-14a4-40d6-904e-483dbe3cb066-config-data\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: 
I0321 05:12:22.205461 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e829eb3c-14a4-40d6-904e-483dbe3cb066-config-data\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.205730 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e829eb3c-14a4-40d6-904e-483dbe3cb066-logs\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.207319 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e829eb3c-14a4-40d6-904e-483dbe3cb066-scripts\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.219478 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cgz9l" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.242401 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e829eb3c-14a4-40d6-904e-483dbe3cb066-horizon-secret-key\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.242970 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-f2ksl" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.257953 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z94fz\" (UniqueName: \"kubernetes.io/projected/e829eb3c-14a4-40d6-904e-483dbe3cb066-kube-api-access-z94fz\") pod \"horizon-7b5d99966f-s6s2j\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.274905 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-5tp6t"] Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.303755 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.303883 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.303916 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.303939 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-nvz6p\" (UniqueName: \"kubernetes.io/projected/cf72524d-be2d-4051-a966-4d0cbfb2523e-kube-api-access-nvz6p\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.303989 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-config\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.304035 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.372936 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.408067 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.412663 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.413444 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.422481 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.422629 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvz6p\" (UniqueName: \"kubernetes.io/projected/cf72524d-be2d-4051-a966-4d0cbfb2523e-kube-api-access-nvz6p\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " 
pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.422825 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-config\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.422978 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.426658 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-config\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.426930 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.427733 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.430438 4580 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.463030 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvz6p\" (UniqueName: \"kubernetes.io/projected/cf72524d-be2d-4051-a966-4d0cbfb2523e-kube-api-access-nvz6p\") pod \"dnsmasq-dns-58dd9ff6bc-5tp6t\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.602672 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rrz6s"] Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.732983 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.922472 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.955908 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9qxn\" (UniqueName: \"kubernetes.io/projected/83aac843-d8d4-4ad6-ba6a-6139d0197002-kube-api-access-b9qxn\") pod \"83aac843-d8d4-4ad6-ba6a-6139d0197002\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.956010 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-dns-swift-storage-0\") pod \"83aac843-d8d4-4ad6-ba6a-6139d0197002\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.956086 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-ovsdbserver-sb\") pod \"83aac843-d8d4-4ad6-ba6a-6139d0197002\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.956150 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-ovsdbserver-nb\") pod \"83aac843-d8d4-4ad6-ba6a-6139d0197002\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.968557 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83aac843-d8d4-4ad6-ba6a-6139d0197002-kube-api-access-b9qxn" (OuterVolumeSpecName: "kube-api-access-b9qxn") pod "83aac843-d8d4-4ad6-ba6a-6139d0197002" (UID: "83aac843-d8d4-4ad6-ba6a-6139d0197002"). InnerVolumeSpecName "kube-api-access-b9qxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.991736 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "83aac843-d8d4-4ad6-ba6a-6139d0197002" (UID: "83aac843-d8d4-4ad6-ba6a-6139d0197002"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:22 crc kubenswrapper[4580]: I0321 05:12:22.998391 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "83aac843-d8d4-4ad6-ba6a-6139d0197002" (UID: "83aac843-d8d4-4ad6-ba6a-6139d0197002"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.003280 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "83aac843-d8d4-4ad6-ba6a-6139d0197002" (UID: "83aac843-d8d4-4ad6-ba6a-6139d0197002"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.059970 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-config\") pod \"83aac843-d8d4-4ad6-ba6a-6139d0197002\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.060343 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-dns-svc\") pod \"83aac843-d8d4-4ad6-ba6a-6139d0197002\" (UID: \"83aac843-d8d4-4ad6-ba6a-6139d0197002\") " Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.060989 4580 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.061008 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.061021 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.061033 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9qxn\" (UniqueName: \"kubernetes.io/projected/83aac843-d8d4-4ad6-ba6a-6139d0197002-kube-api-access-b9qxn\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.091698 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "83aac843-d8d4-4ad6-ba6a-6139d0197002" (UID: "83aac843-d8d4-4ad6-ba6a-6139d0197002"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.117106 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-config" (OuterVolumeSpecName: "config") pod "83aac843-d8d4-4ad6-ba6a-6139d0197002" (UID: "83aac843-d8d4-4ad6-ba6a-6139d0197002"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.163075 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.163118 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83aac843-d8d4-4ad6-ba6a-6139d0197002-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.310993 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57f4c4598f-qtdw2"] Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.360940 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-f4kld"] Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.396809 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-tb6cr"] Mar 21 05:12:23 crc kubenswrapper[4580]: W0321 05:12:23.401390 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6b87fba_7d5e_4e85_ad9b_eabc152a1675.slice/crio-dab763dc7e0637da5815a89e1a352b3b0c379cacdfd8ffc8d67528ab62081403 
WatchSource:0}: Error finding container dab763dc7e0637da5815a89e1a352b3b0c379cacdfd8ffc8d67528ab62081403: Status 404 returned error can't find the container with id dab763dc7e0637da5815a89e1a352b3b0c379cacdfd8ffc8d67528ab62081403 Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.415900 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-m7rwg"] Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.441595 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cgz9l"] Mar 21 05:12:23 crc kubenswrapper[4580]: W0321 05:12:23.452025 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1916415_d4eb_4dbd_bccb_ac932a09843c.slice/crio-38061da3297295905c0818b842451978ca28a011c31eab67453c6d62cc4ccbf2 WatchSource:0}: Error finding container 38061da3297295905c0818b842451978ca28a011c31eab67453c6d62cc4ccbf2: Status 404 returned error can't find the container with id 38061da3297295905c0818b842451978ca28a011c31eab67453c6d62cc4ccbf2 Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.480713 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.552493 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f2ksl"] Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.607815 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-5tp6t"] Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.614892 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-f4kld" event={"ID":"b6b87fba-7d5e-4e85-ad9b-eabc152a1675","Type":"ContainerStarted","Data":"dab763dc7e0637da5815a89e1a352b3b0c379cacdfd8ffc8d67528ab62081403"} Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.666623 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.676811 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b5d99966f-s6s2j"] Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.676853 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rrz6s" event={"ID":"27996094-c657-451a-98fe-b960c1b88d31","Type":"ContainerStarted","Data":"d0014e99e2f2d74cb5b862030bdc3f140458004263e57eda6acee87acd800e7c"} Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.676874 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rrz6s" event={"ID":"27996094-c657-451a-98fe-b960c1b88d31","Type":"ContainerStarted","Data":"7ff85440fddfb864e5e73511954511c506f315eed8448e53aac8543b52400dd8"} Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.676886 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-v4rvs" event={"ID":"83aac843-d8d4-4ad6-ba6a-6139d0197002","Type":"ContainerDied","Data":"e9661e9ddc45c379f47cd0088432f34c0fbd766d0d78e0772a9d47bcad8b208e"} Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.676907 4580 scope.go:117] "RemoveContainer" containerID="afaded7a1d4c13b1795a3867e7f5743d3f099d25044f7f2e32754f806f001585" Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.697480 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cgz9l" event={"ID":"2234c053-0318-4d03-8e9f-b9ee569529fc","Type":"ContainerStarted","Data":"71e1c43ecf211a081a5346423cfe9b1d5914d19409029eb95667bb92169b8fc5"} Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.704941 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57f4c4598f-qtdw2" event={"ID":"aec8eff5-d865-4c47-acda-51609f6df4b6","Type":"ContainerStarted","Data":"27dd96578092090b12c72eb3856a9f2ac3807c98c28fe386c7d8f43751525e22"} Mar 21 05:12:23 crc 
kubenswrapper[4580]: I0321 05:12:23.707721 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tb6cr" event={"ID":"b1916415-d4eb-4dbd-bccb-ac932a09843c","Type":"ContainerStarted","Data":"38061da3297295905c0818b842451978ca28a011c31eab67453c6d62cc4ccbf2"} Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.708864 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f2ksl" event={"ID":"3bfbab08-aee7-43bf-9118-252682438c95","Type":"ContainerStarted","Data":"62c79f8b2bded3641cf7a2d8bfc6f0792c50545e9d4d4426b00640accdd01c86"} Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.710656 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m7rwg" event={"ID":"55568564-a701-4c16-b5c4-617f88c364a5","Type":"ContainerStarted","Data":"50ffd984b7e6da06c3d33d9f673c1055627920b7b2eb422080c870a96685400d"} Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.717398 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c311e091-7cf1-426b-9788-a3d64b198e43","Type":"ContainerStarted","Data":"8a6e526f983bd78cd69797adaa7f597120eb814df47b399d7701ac69c847533d"} Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.724020 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" event={"ID":"cf72524d-be2d-4051-a966-4d0cbfb2523e","Type":"ContainerStarted","Data":"8ba69dfde9e2ad60f55f82c17533eaa48587760d235be8fe93071a1c3be66131"} Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.736604 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rrz6s" podStartSLOduration=3.736582039 podStartE2EDuration="3.736582039s" podCreationTimestamp="2026-03-21 05:12:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:12:23.69108548 +0000 UTC m=+1248.773669118" 
watchObservedRunningTime="2026-03-21 05:12:23.736582039 +0000 UTC m=+1248.819165667" Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.783999 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v4rvs"] Mar 21 05:12:23 crc kubenswrapper[4580]: I0321 05:12:23.806864 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v4rvs"] Mar 21 05:12:24 crc kubenswrapper[4580]: I0321 05:12:24.742103 4580 generic.go:334] "Generic (PLEG): container finished" podID="cf72524d-be2d-4051-a966-4d0cbfb2523e" containerID="f714f7b375bf2479c501f8fadaefd888ccca1a257ad4775c47ca7d73f68007d5" exitCode=0 Mar 21 05:12:24 crc kubenswrapper[4580]: I0321 05:12:24.742185 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" event={"ID":"cf72524d-be2d-4051-a966-4d0cbfb2523e","Type":"ContainerDied","Data":"f714f7b375bf2479c501f8fadaefd888ccca1a257ad4775c47ca7d73f68007d5"} Mar 21 05:12:24 crc kubenswrapper[4580]: I0321 05:12:24.749062 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b5d99966f-s6s2j" event={"ID":"e829eb3c-14a4-40d6-904e-483dbe3cb066","Type":"ContainerStarted","Data":"d79734d0e603cbaec138d59e662806c1033fd5a7b9db3aa5dc51f055061b5bbc"} Mar 21 05:12:24 crc kubenswrapper[4580]: I0321 05:12:24.760863 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tb6cr" event={"ID":"b1916415-d4eb-4dbd-bccb-ac932a09843c","Type":"ContainerStarted","Data":"5271d2e1c3d8b200abfe020ec89602d597e379128fb07749c2f006b86f466527"} Mar 21 05:12:24 crc kubenswrapper[4580]: I0321 05:12:24.764431 4580 generic.go:334] "Generic (PLEG): container finished" podID="b6b87fba-7d5e-4e85-ad9b-eabc152a1675" containerID="3d39f76a43d90e238fb2d5478f748252bc7c1fd66ec07e6088710278e2c535bc" exitCode=0 Mar 21 05:12:24 crc kubenswrapper[4580]: I0321 05:12:24.764752 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5959f8865f-f4kld" event={"ID":"b6b87fba-7d5e-4e85-ad9b-eabc152a1675","Type":"ContainerDied","Data":"3d39f76a43d90e238fb2d5478f748252bc7c1fd66ec07e6088710278e2c535bc"} Mar 21 05:12:24 crc kubenswrapper[4580]: I0321 05:12:24.814966 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-tb6cr" podStartSLOduration=3.814944543 podStartE2EDuration="3.814944543s" podCreationTimestamp="2026-03-21 05:12:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:12:24.803815007 +0000 UTC m=+1249.886398635" watchObservedRunningTime="2026-03-21 05:12:24.814944543 +0000 UTC m=+1249.897528171" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.397968 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57f4c4598f-qtdw2"] Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.412700 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.432580 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.467586 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-56877c58ff-zj7t6"] Mar 21 05:12:25 crc kubenswrapper[4580]: E0321 05:12:25.467995 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b87fba-7d5e-4e85-ad9b-eabc152a1675" containerName="init" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.468015 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b87fba-7d5e-4e85-ad9b-eabc152a1675" containerName="init" Mar 21 05:12:25 crc kubenswrapper[4580]: E0321 05:12:25.468035 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83aac843-d8d4-4ad6-ba6a-6139d0197002" containerName="init" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.468042 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="83aac843-d8d4-4ad6-ba6a-6139d0197002" containerName="init" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.468246 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b87fba-7d5e-4e85-ad9b-eabc152a1675" containerName="init" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.468270 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="83aac843-d8d4-4ad6-ba6a-6139d0197002" containerName="init" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.474565 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.524595 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56877c58ff-zj7t6"] Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.544317 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-config\") pod \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.544492 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-ovsdbserver-nb\") pod \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.544552 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-dns-svc\") pod \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.544615 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l75cl\" (UniqueName: \"kubernetes.io/projected/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-kube-api-access-l75cl\") pod \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.544642 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-dns-swift-storage-0\") pod \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " Mar 21 
05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.544694 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-ovsdbserver-sb\") pod \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\" (UID: \"b6b87fba-7d5e-4e85-ad9b-eabc152a1675\") " Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.544890 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6895574-7c6f-4a69-a821-8f6ce5b33506-scripts\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.544955 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r2qg\" (UniqueName: \"kubernetes.io/projected/e6895574-7c6f-4a69-a821-8f6ce5b33506-kube-api-access-8r2qg\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.545010 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6895574-7c6f-4a69-a821-8f6ce5b33506-config-data\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.545064 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6895574-7c6f-4a69-a821-8f6ce5b33506-horizon-secret-key\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 
05:12:25.545085 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6895574-7c6f-4a69-a821-8f6ce5b33506-logs\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.605988 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-config" (OuterVolumeSpecName: "config") pod "b6b87fba-7d5e-4e85-ad9b-eabc152a1675" (UID: "b6b87fba-7d5e-4e85-ad9b-eabc152a1675"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.608114 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6b87fba-7d5e-4e85-ad9b-eabc152a1675" (UID: "b6b87fba-7d5e-4e85-ad9b-eabc152a1675"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.621652 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6b87fba-7d5e-4e85-ad9b-eabc152a1675" (UID: "b6b87fba-7d5e-4e85-ad9b-eabc152a1675"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.624222 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-kube-api-access-l75cl" (OuterVolumeSpecName: "kube-api-access-l75cl") pod "b6b87fba-7d5e-4e85-ad9b-eabc152a1675" (UID: "b6b87fba-7d5e-4e85-ad9b-eabc152a1675"). 
InnerVolumeSpecName "kube-api-access-l75cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.626324 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6b87fba-7d5e-4e85-ad9b-eabc152a1675" (UID: "b6b87fba-7d5e-4e85-ad9b-eabc152a1675"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.644866 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b6b87fba-7d5e-4e85-ad9b-eabc152a1675" (UID: "b6b87fba-7d5e-4e85-ad9b-eabc152a1675"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.647078 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6895574-7c6f-4a69-a821-8f6ce5b33506-horizon-secret-key\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.647125 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6895574-7c6f-4a69-a821-8f6ce5b33506-logs\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.647176 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6895574-7c6f-4a69-a821-8f6ce5b33506-scripts\") pod 
\"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.647241 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r2qg\" (UniqueName: \"kubernetes.io/projected/e6895574-7c6f-4a69-a821-8f6ce5b33506-kube-api-access-8r2qg\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.647302 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6895574-7c6f-4a69-a821-8f6ce5b33506-config-data\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.647397 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.647417 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.647429 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.647440 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.647453 4580 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l75cl\" (UniqueName: \"kubernetes.io/projected/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-kube-api-access-l75cl\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.647466 4580 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6b87fba-7d5e-4e85-ad9b-eabc152a1675-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.648472 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6895574-7c6f-4a69-a821-8f6ce5b33506-config-data\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.649432 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6895574-7c6f-4a69-a821-8f6ce5b33506-scripts\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.649647 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6895574-7c6f-4a69-a821-8f6ce5b33506-logs\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.652690 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6895574-7c6f-4a69-a821-8f6ce5b33506-horizon-secret-key\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 
05:12:25.663771 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83aac843-d8d4-4ad6-ba6a-6139d0197002" path="/var/lib/kubelet/pods/83aac843-d8d4-4ad6-ba6a-6139d0197002/volumes" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.679360 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r2qg\" (UniqueName: \"kubernetes.io/projected/e6895574-7c6f-4a69-a821-8f6ce5b33506-kube-api-access-8r2qg\") pod \"horizon-56877c58ff-zj7t6\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.794048 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.815211 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" event={"ID":"cf72524d-be2d-4051-a966-4d0cbfb2523e","Type":"ContainerStarted","Data":"d61cd1fa45b0e5a3d303367f2ba6300dc6aa59e59a82a07987f668ee101ff12f"} Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.815657 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.830749 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n4vzq" event={"ID":"f86c26eb-bb22-460c-8a45-191a02924112","Type":"ContainerStarted","Data":"3ff385e1bbcee324592b75b1425bea21d93e458294c849c13f312dc58af9311a"} Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.851590 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-f4kld" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.852084 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-f4kld" event={"ID":"b6b87fba-7d5e-4e85-ad9b-eabc152a1675","Type":"ContainerDied","Data":"dab763dc7e0637da5815a89e1a352b3b0c379cacdfd8ffc8d67528ab62081403"} Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.852151 4580 scope.go:117] "RemoveContainer" containerID="3d39f76a43d90e238fb2d5478f748252bc7c1fd66ec07e6088710278e2c535bc" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.858677 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" podStartSLOduration=4.858661775 podStartE2EDuration="4.858661775s" podCreationTimestamp="2026-03-21 05:12:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:12:25.840859232 +0000 UTC m=+1250.923442870" watchObservedRunningTime="2026-03-21 05:12:25.858661775 +0000 UTC m=+1250.941245403" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.877069 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-n4vzq" podStartSLOduration=3.409366779 podStartE2EDuration="37.877048604s" podCreationTimestamp="2026-03-21 05:11:48 +0000 UTC" firstStartedPulling="2026-03-21 05:11:49.634384843 +0000 UTC m=+1214.716968471" lastFinishedPulling="2026-03-21 05:12:24.102066668 +0000 UTC m=+1249.184650296" observedRunningTime="2026-03-21 05:12:25.862526158 +0000 UTC m=+1250.945109786" watchObservedRunningTime="2026-03-21 05:12:25.877048604 +0000 UTC m=+1250.959632232" Mar 21 05:12:25 crc kubenswrapper[4580]: I0321 05:12:25.986402 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-f4kld"] Mar 21 05:12:26 crc kubenswrapper[4580]: I0321 05:12:26.003107 4580 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-f4kld"] Mar 21 05:12:26 crc kubenswrapper[4580]: I0321 05:12:26.601652 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56877c58ff-zj7t6"] Mar 21 05:12:26 crc kubenswrapper[4580]: W0321 05:12:26.631936 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6895574_7c6f_4a69_a821_8f6ce5b33506.slice/crio-c14246559dcb2876e3bd2af2d6f0bfb28d0b2929b29ef9ccdde6d85dfa0368d5 WatchSource:0}: Error finding container c14246559dcb2876e3bd2af2d6f0bfb28d0b2929b29ef9ccdde6d85dfa0368d5: Status 404 returned error can't find the container with id c14246559dcb2876e3bd2af2d6f0bfb28d0b2929b29ef9ccdde6d85dfa0368d5 Mar 21 05:12:26 crc kubenswrapper[4580]: I0321 05:12:26.864040 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56877c58ff-zj7t6" event={"ID":"e6895574-7c6f-4a69-a821-8f6ce5b33506","Type":"ContainerStarted","Data":"c14246559dcb2876e3bd2af2d6f0bfb28d0b2929b29ef9ccdde6d85dfa0368d5"} Mar 21 05:12:27 crc kubenswrapper[4580]: I0321 05:12:27.634053 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b87fba-7d5e-4e85-ad9b-eabc152a1675" path="/var/lib/kubelet/pods/b6b87fba-7d5e-4e85-ad9b-eabc152a1675/volumes" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.018832 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b5d99966f-s6s2j"] Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.070839 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-587cfc8688-265kc"] Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.072853 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.077666 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.078492 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-587cfc8688-265kc"] Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.156697 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-56877c58ff-zj7t6"] Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.185794 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67655f8b6-mbx6n"] Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.187730 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.202084 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67655f8b6-mbx6n"] Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.207094 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a0110f-428a-481d-b439-bc16e6837dc3-logs\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.207237 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-horizon-tls-certs\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.207274 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/08a0110f-428a-481d-b439-bc16e6837dc3-scripts\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.207319 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z44z\" (UniqueName: \"kubernetes.io/projected/08a0110f-428a-481d-b439-bc16e6837dc3-kube-api-access-2z44z\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.207350 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08a0110f-428a-481d-b439-bc16e6837dc3-config-data\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.207377 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-combined-ca-bundle\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.207420 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-horizon-secret-key\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.308902 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-horizon-tls-certs\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.308958 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-combined-ca-bundle\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.308987 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08a0110f-428a-481d-b439-bc16e6837dc3-scripts\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.309005 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-logs\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.309026 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-scripts\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.309069 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z44z\" (UniqueName: 
\"kubernetes.io/projected/08a0110f-428a-481d-b439-bc16e6837dc3-kube-api-access-2z44z\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.309090 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08a0110f-428a-481d-b439-bc16e6837dc3-config-data\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.309107 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9892p\" (UniqueName: \"kubernetes.io/projected/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-kube-api-access-9892p\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.309130 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-combined-ca-bundle\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.309160 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-config-data\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.309181 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-horizon-secret-key\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.309198 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-horizon-tls-certs\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.309217 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a0110f-428a-481d-b439-bc16e6837dc3-logs\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.309278 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-horizon-secret-key\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.310491 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08a0110f-428a-481d-b439-bc16e6837dc3-scripts\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.310751 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a0110f-428a-481d-b439-bc16e6837dc3-logs\") pod \"horizon-587cfc8688-265kc\" (UID: 
\"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.312036 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08a0110f-428a-481d-b439-bc16e6837dc3-config-data\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.317689 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-horizon-tls-certs\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.319254 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-horizon-secret-key\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.329393 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z44z\" (UniqueName: \"kubernetes.io/projected/08a0110f-428a-481d-b439-bc16e6837dc3-kube-api-access-2z44z\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.341092 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-combined-ca-bundle\") pod \"horizon-587cfc8688-265kc\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc 
kubenswrapper[4580]: I0321 05:12:31.393090 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.411629 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-horizon-tls-certs\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.411741 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-horizon-secret-key\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.411813 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-combined-ca-bundle\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.411835 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-logs\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.411853 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-scripts\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " 
pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.411892 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9892p\" (UniqueName: \"kubernetes.io/projected/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-kube-api-access-9892p\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.411928 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-config-data\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.414207 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-scripts\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.416213 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-config-data\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.416437 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-logs\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.435007 4580 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-horizon-secret-key\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.435319 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-combined-ca-bundle\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.435407 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-horizon-tls-certs\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.438631 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9892p\" (UniqueName: \"kubernetes.io/projected/a03ce0fa-f7e8-4b48-bbea-95807f14dd26-kube-api-access-9892p\") pod \"horizon-67655f8b6-mbx6n\" (UID: \"a03ce0fa-f7e8-4b48-bbea-95807f14dd26\") " pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:31 crc kubenswrapper[4580]: I0321 05:12:31.506529 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:12:32 crc kubenswrapper[4580]: I0321 05:12:32.736298 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" Mar 21 05:12:32 crc kubenswrapper[4580]: I0321 05:12:32.793010 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mtsm2"] Mar 21 05:12:32 crc kubenswrapper[4580]: I0321 05:12:32.793228 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-mtsm2" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerName="dnsmasq-dns" containerID="cri-o://d34a243adbe1462703a007a8dac8774f1d988e5dc334028b7c8492fa49467735" gracePeriod=10 Mar 21 05:12:32 crc kubenswrapper[4580]: I0321 05:12:32.971697 4580 generic.go:334] "Generic (PLEG): container finished" podID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerID="d34a243adbe1462703a007a8dac8774f1d988e5dc334028b7c8492fa49467735" exitCode=0 Mar 21 05:12:32 crc kubenswrapper[4580]: I0321 05:12:32.971739 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mtsm2" event={"ID":"1bcfc888-3413-4eb9-a887-ef250a15962a","Type":"ContainerDied","Data":"d34a243adbe1462703a007a8dac8774f1d988e5dc334028b7c8492fa49467735"} Mar 21 05:12:33 crc kubenswrapper[4580]: E0321 05:12:33.071387 4580 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bcfc888_3413_4eb9_a887_ef250a15962a.slice/crio-d34a243adbe1462703a007a8dac8774f1d988e5dc334028b7c8492fa49467735.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bcfc888_3413_4eb9_a887_ef250a15962a.slice/crio-conmon-d34a243adbe1462703a007a8dac8774f1d988e5dc334028b7c8492fa49467735.scope\": RecentStats: unable to find data in memory 
cache]" Mar 21 05:12:33 crc kubenswrapper[4580]: I0321 05:12:33.231615 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-mtsm2" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 21 05:12:33 crc kubenswrapper[4580]: I0321 05:12:33.981123 4580 generic.go:334] "Generic (PLEG): container finished" podID="27996094-c657-451a-98fe-b960c1b88d31" containerID="d0014e99e2f2d74cb5b862030bdc3f140458004263e57eda6acee87acd800e7c" exitCode=0 Mar 21 05:12:33 crc kubenswrapper[4580]: I0321 05:12:33.981165 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rrz6s" event={"ID":"27996094-c657-451a-98fe-b960c1b88d31","Type":"ContainerDied","Data":"d0014e99e2f2d74cb5b862030bdc3f140458004263e57eda6acee87acd800e7c"} Mar 21 05:12:38 crc kubenswrapper[4580]: I0321 05:12:38.232435 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-mtsm2" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 21 05:12:42 crc kubenswrapper[4580]: E0321 05:12:42.711275 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 21 05:12:42 crc kubenswrapper[4580]: E0321 05:12:42.712046 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5d6h4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-cgz9l_openstack(2234c053-0318-4d03-8e9f-b9ee569529fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:12:42 crc kubenswrapper[4580]: E0321 05:12:42.713469 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-cgz9l" podUID="2234c053-0318-4d03-8e9f-b9ee569529fc" Mar 21 05:12:42 crc kubenswrapper[4580]: E0321 05:12:42.771624 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 21 05:12:42 crc kubenswrapper[4580]: E0321 05:12:42.771804 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n685h576h685h5c5h565hf6h696h68h56bhffhd6h55h669h74h5cch55bh56dhddh556h555h545h9bh568h55bh554h75h5bch5f9h586h99h548h58dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j25d2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-57f4c4598f-qtdw2_openstack(aec8eff5-d865-4c47-acda-51609f6df4b6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:12:42 crc kubenswrapper[4580]: E0321 
05:12:42.775537 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-57f4c4598f-qtdw2" podUID="aec8eff5-d865-4c47-acda-51609f6df4b6" Mar 21 05:12:42 crc kubenswrapper[4580]: E0321 05:12:42.817984 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 21 05:12:42 crc kubenswrapper[4580]: E0321 05:12:42.818140 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n87h66dh698hc5h5bch554h64bh9fh546h579h545h67dh5b4h58bh6bhcdh649h79h89h685h58ch67ch7dh567h9dh5f7h5dh686h5cbh9bhb6h5dfq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z94fz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7b5d99966f-s6s2j_openstack(e829eb3c-14a4-40d6-904e-483dbe3cb066): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:12:42 crc kubenswrapper[4580]: E0321 
05:12:42.820878 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7b5d99966f-s6s2j" podUID="e829eb3c-14a4-40d6-904e-483dbe3cb066" Mar 21 05:12:42 crc kubenswrapper[4580]: E0321 05:12:42.913632 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 21 05:12:42 crc kubenswrapper[4580]: E0321 05:12:42.913888 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h577h5b9h68fh55dhcbh696h6h655h557hb8hb4hd4h695hd7hbbhb4h5cch9fh55fh599h664h574h66fh65hb4h97h5fch56dh7ch8bh584q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8r2qg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-56877c58ff-zj7t6_openstack(e6895574-7c6f-4a69-a821-8f6ce5b33506): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:12:42 crc kubenswrapper[4580]: E0321 
05:12:42.918521 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-56877c58ff-zj7t6" podUID="e6895574-7c6f-4a69-a821-8f6ce5b33506" Mar 21 05:12:43 crc kubenswrapper[4580]: E0321 05:12:43.051152 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-cgz9l" podUID="2234c053-0318-4d03-8e9f-b9ee569529fc" Mar 21 05:12:43 crc kubenswrapper[4580]: I0321 05:12:43.231599 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-mtsm2" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 21 05:12:43 crc kubenswrapper[4580]: I0321 05:12:43.232030 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:12:48 crc kubenswrapper[4580]: I0321 05:12:48.239252 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-mtsm2" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 21 05:12:53 crc kubenswrapper[4580]: I0321 05:12:53.232049 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-mtsm2" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 
10.217.0.115:5353: connect: connection refused" Mar 21 05:12:54 crc kubenswrapper[4580]: E0321 05:12:54.623144 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 21 05:12:54 crc kubenswrapper[4580]: E0321 05:12:54.623616 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n57dhd6h5c5h568h94h5f7h55h5cbh6fh584h547hbbhcchf5h4h557h5fch59h64h57h695h66bh55bh5fbh549h5fdh58fh5fbh669h67h586h5bbq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m5v52,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(c311e091-7cf1-426b-9788-a3d64b198e43): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.162813 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rrz6s" event={"ID":"27996094-c657-451a-98fe-b960c1b88d31","Type":"ContainerDied","Data":"7ff85440fddfb864e5e73511954511c506f315eed8448e53aac8543b52400dd8"} Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.163138 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ff85440fddfb864e5e73511954511c506f315eed8448e53aac8543b52400dd8" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.163981 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56877c58ff-zj7t6" event={"ID":"e6895574-7c6f-4a69-a821-8f6ce5b33506","Type":"ContainerDied","Data":"c14246559dcb2876e3bd2af2d6f0bfb28d0b2929b29ef9ccdde6d85dfa0368d5"} Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.164042 4580 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="c14246559dcb2876e3bd2af2d6f0bfb28d0b2929b29ef9ccdde6d85dfa0368d5" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.165026 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57f4c4598f-qtdw2" event={"ID":"aec8eff5-d865-4c47-acda-51609f6df4b6","Type":"ContainerDied","Data":"27dd96578092090b12c72eb3856a9f2ac3807c98c28fe386c7d8f43751525e22"} Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.165049 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27dd96578092090b12c72eb3856a9f2ac3807c98c28fe386c7d8f43751525e22" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.213801 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.235325 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.243245 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:55 crc kubenswrapper[4580]: E0321 05:12:55.248378 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 21 05:12:55 crc kubenswrapper[4580]: E0321 05:12:55.248548 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d9cdd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDev
ices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-f2ksl_openstack(3bfbab08-aee7-43bf-9118-252682438c95): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:12:55 crc kubenswrapper[4580]: E0321 05:12:55.249875 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-f2ksl" podUID="3bfbab08-aee7-43bf-9118-252682438c95" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.370425 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-combined-ca-bundle\") pod \"27996094-c657-451a-98fe-b960c1b88d31\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.370522 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6895574-7c6f-4a69-a821-8f6ce5b33506-scripts\") pod \"e6895574-7c6f-4a69-a821-8f6ce5b33506\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.370582 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aec8eff5-d865-4c47-acda-51609f6df4b6-logs\") pod \"aec8eff5-d865-4c47-acda-51609f6df4b6\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.370625 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6895574-7c6f-4a69-a821-8f6ce5b33506-horizon-secret-key\") pod \"e6895574-7c6f-4a69-a821-8f6ce5b33506\" (UID: 
\"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.370691 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r2qg\" (UniqueName: \"kubernetes.io/projected/e6895574-7c6f-4a69-a821-8f6ce5b33506-kube-api-access-8r2qg\") pod \"e6895574-7c6f-4a69-a821-8f6ce5b33506\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.370725 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aec8eff5-d865-4c47-acda-51609f6df4b6-horizon-secret-key\") pod \"aec8eff5-d865-4c47-acda-51609f6df4b6\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.370750 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6895574-7c6f-4a69-a821-8f6ce5b33506-config-data\") pod \"e6895574-7c6f-4a69-a821-8f6ce5b33506\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.370801 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-config-data\") pod \"27996094-c657-451a-98fe-b960c1b88d31\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.370831 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aec8eff5-d865-4c47-acda-51609f6df4b6-config-data\") pod \"aec8eff5-d865-4c47-acda-51609f6df4b6\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.370861 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j25d2\" (UniqueName: 
\"kubernetes.io/projected/aec8eff5-d865-4c47-acda-51609f6df4b6-kube-api-access-j25d2\") pod \"aec8eff5-d865-4c47-acda-51609f6df4b6\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.370968 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aec8eff5-d865-4c47-acda-51609f6df4b6-logs" (OuterVolumeSpecName: "logs") pod "aec8eff5-d865-4c47-acda-51609f6df4b6" (UID: "aec8eff5-d865-4c47-acda-51609f6df4b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.371033 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-credential-keys\") pod \"27996094-c657-451a-98fe-b960c1b88d31\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.371090 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aec8eff5-d865-4c47-acda-51609f6df4b6-scripts\") pod \"aec8eff5-d865-4c47-acda-51609f6df4b6\" (UID: \"aec8eff5-d865-4c47-acda-51609f6df4b6\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.371120 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6895574-7c6f-4a69-a821-8f6ce5b33506-logs\") pod \"e6895574-7c6f-4a69-a821-8f6ce5b33506\" (UID: \"e6895574-7c6f-4a69-a821-8f6ce5b33506\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.371150 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-fernet-keys\") pod \"27996094-c657-451a-98fe-b960c1b88d31\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " Mar 21 05:12:55 crc 
kubenswrapper[4580]: I0321 05:12:55.371221 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwlg6\" (UniqueName: \"kubernetes.io/projected/27996094-c657-451a-98fe-b960c1b88d31-kube-api-access-gwlg6\") pod \"27996094-c657-451a-98fe-b960c1b88d31\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.371244 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-scripts\") pod \"27996094-c657-451a-98fe-b960c1b88d31\" (UID: \"27996094-c657-451a-98fe-b960c1b88d31\") " Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.371753 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aec8eff5-d865-4c47-acda-51609f6df4b6-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.371986 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6895574-7c6f-4a69-a821-8f6ce5b33506-config-data" (OuterVolumeSpecName: "config-data") pod "e6895574-7c6f-4a69-a821-8f6ce5b33506" (UID: "e6895574-7c6f-4a69-a821-8f6ce5b33506"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.372299 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6895574-7c6f-4a69-a821-8f6ce5b33506-scripts" (OuterVolumeSpecName: "scripts") pod "e6895574-7c6f-4a69-a821-8f6ce5b33506" (UID: "e6895574-7c6f-4a69-a821-8f6ce5b33506"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.372628 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aec8eff5-d865-4c47-acda-51609f6df4b6-scripts" (OuterVolumeSpecName: "scripts") pod "aec8eff5-d865-4c47-acda-51609f6df4b6" (UID: "aec8eff5-d865-4c47-acda-51609f6df4b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.377000 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aec8eff5-d865-4c47-acda-51609f6df4b6-config-data" (OuterVolumeSpecName: "config-data") pod "aec8eff5-d865-4c47-acda-51609f6df4b6" (UID: "aec8eff5-d865-4c47-acda-51609f6df4b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.379018 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec8eff5-d865-4c47-acda-51609f6df4b6-kube-api-access-j25d2" (OuterVolumeSpecName: "kube-api-access-j25d2") pod "aec8eff5-d865-4c47-acda-51609f6df4b6" (UID: "aec8eff5-d865-4c47-acda-51609f6df4b6"). InnerVolumeSpecName "kube-api-access-j25d2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.379389 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6895574-7c6f-4a69-a821-8f6ce5b33506-logs" (OuterVolumeSpecName: "logs") pod "e6895574-7c6f-4a69-a821-8f6ce5b33506" (UID: "e6895574-7c6f-4a69-a821-8f6ce5b33506"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.379684 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "27996094-c657-451a-98fe-b960c1b88d31" (UID: "27996094-c657-451a-98fe-b960c1b88d31"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.383420 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6895574-7c6f-4a69-a821-8f6ce5b33506-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e6895574-7c6f-4a69-a821-8f6ce5b33506" (UID: "e6895574-7c6f-4a69-a821-8f6ce5b33506"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.383648 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27996094-c657-451a-98fe-b960c1b88d31-kube-api-access-gwlg6" (OuterVolumeSpecName: "kube-api-access-gwlg6") pod "27996094-c657-451a-98fe-b960c1b88d31" (UID: "27996094-c657-451a-98fe-b960c1b88d31"). InnerVolumeSpecName "kube-api-access-gwlg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.388826 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-scripts" (OuterVolumeSpecName: "scripts") pod "27996094-c657-451a-98fe-b960c1b88d31" (UID: "27996094-c657-451a-98fe-b960c1b88d31"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.392094 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "27996094-c657-451a-98fe-b960c1b88d31" (UID: "27996094-c657-451a-98fe-b960c1b88d31"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.401556 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6895574-7c6f-4a69-a821-8f6ce5b33506-kube-api-access-8r2qg" (OuterVolumeSpecName: "kube-api-access-8r2qg") pod "e6895574-7c6f-4a69-a821-8f6ce5b33506" (UID: "e6895574-7c6f-4a69-a821-8f6ce5b33506"). InnerVolumeSpecName "kube-api-access-8r2qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.416750 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec8eff5-d865-4c47-acda-51609f6df4b6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "aec8eff5-d865-4c47-acda-51609f6df4b6" (UID: "aec8eff5-d865-4c47-acda-51609f6df4b6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.445528 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27996094-c657-451a-98fe-b960c1b88d31" (UID: "27996094-c657-451a-98fe-b960c1b88d31"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.450384 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-config-data" (OuterVolumeSpecName: "config-data") pod "27996094-c657-451a-98fe-b960c1b88d31" (UID: "27996094-c657-451a-98fe-b960c1b88d31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473107 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6895574-7c6f-4a69-a821-8f6ce5b33506-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473139 4580 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6895574-7c6f-4a69-a821-8f6ce5b33506-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473151 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r2qg\" (UniqueName: \"kubernetes.io/projected/e6895574-7c6f-4a69-a821-8f6ce5b33506-kube-api-access-8r2qg\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473160 4580 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aec8eff5-d865-4c47-acda-51609f6df4b6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473168 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6895574-7c6f-4a69-a821-8f6ce5b33506-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473177 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473185 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aec8eff5-d865-4c47-acda-51609f6df4b6-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473193 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j25d2\" (UniqueName: \"kubernetes.io/projected/aec8eff5-d865-4c47-acda-51609f6df4b6-kube-api-access-j25d2\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473201 4580 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473208 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aec8eff5-d865-4c47-acda-51609f6df4b6-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473236 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6895574-7c6f-4a69-a821-8f6ce5b33506-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473243 4580 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473253 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwlg6\" (UniqueName: \"kubernetes.io/projected/27996094-c657-451a-98fe-b960c1b88d31-kube-api-access-gwlg6\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc 
kubenswrapper[4580]: I0321 05:12:55.473260 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:55 crc kubenswrapper[4580]: I0321 05:12:55.473268 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27996094-c657-451a-98fe-b960c1b88d31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.174272 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57f4c4598f-qtdw2" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.174300 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rrz6s" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.175273 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56877c58ff-zj7t6" Mar 21 05:12:56 crc kubenswrapper[4580]: E0321 05:12:56.182239 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-f2ksl" podUID="3bfbab08-aee7-43bf-9118-252682438c95" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.246384 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57f4c4598f-qtdw2"] Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.247202 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-57f4c4598f-qtdw2"] Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.293766 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-56877c58ff-zj7t6"] Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 
05:12:56.305207 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-56877c58ff-zj7t6"] Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.327904 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rrz6s"] Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.334835 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rrz6s"] Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.421951 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2tfwn"] Mar 21 05:12:56 crc kubenswrapper[4580]: E0321 05:12:56.422539 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27996094-c657-451a-98fe-b960c1b88d31" containerName="keystone-bootstrap" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.422556 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="27996094-c657-451a-98fe-b960c1b88d31" containerName="keystone-bootstrap" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.422710 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="27996094-c657-451a-98fe-b960c1b88d31" containerName="keystone-bootstrap" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.423250 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.426203 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.427820 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.435044 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.435294 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.435882 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2tfwn"] Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.437030 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sqx2b" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.497376 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-config-data\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.497559 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-fernet-keys\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.497581 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-credential-keys\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.497600 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-scripts\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.497689 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-combined-ca-bundle\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.497710 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7ldk\" (UniqueName: \"kubernetes.io/projected/3391bccc-2f7a-469a-8166-5ce7169e9917-kube-api-access-b7ldk\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.600410 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-combined-ca-bundle\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.600479 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7ldk\" 
(UniqueName: \"kubernetes.io/projected/3391bccc-2f7a-469a-8166-5ce7169e9917-kube-api-access-b7ldk\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.600574 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-config-data\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.600628 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-fernet-keys\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.600645 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-credential-keys\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.600667 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-scripts\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.606516 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-scripts\") pod \"keystone-bootstrap-2tfwn\" (UID: 
\"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.606982 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-config-data\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.607604 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-combined-ca-bundle\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.608155 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-credential-keys\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.612469 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-fernet-keys\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 05:12:56.618908 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7ldk\" (UniqueName: \"kubernetes.io/projected/3391bccc-2f7a-469a-8166-5ce7169e9917-kube-api-access-b7ldk\") pod \"keystone-bootstrap-2tfwn\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") " pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:56 crc kubenswrapper[4580]: I0321 
05:12:56.744064 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:12:57 crc kubenswrapper[4580]: I0321 05:12:57.309717 4580 scope.go:117] "RemoveContainer" containerID="3231a6ad006cb3b4b5b09cbcab62f71f5d2bcef3b7c14fc34a9085fc8cdeff20" Mar 21 05:12:57 crc kubenswrapper[4580]: I0321 05:12:57.627979 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27996094-c657-451a-98fe-b960c1b88d31" path="/var/lib/kubelet/pods/27996094-c657-451a-98fe-b960c1b88d31/volumes" Mar 21 05:12:57 crc kubenswrapper[4580]: I0321 05:12:57.629434 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec8eff5-d865-4c47-acda-51609f6df4b6" path="/var/lib/kubelet/pods/aec8eff5-d865-4c47-acda-51609f6df4b6/volumes" Mar 21 05:12:57 crc kubenswrapper[4580]: I0321 05:12:57.630057 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6895574-7c6f-4a69-a821-8f6ce5b33506" path="/var/lib/kubelet/pods/e6895574-7c6f-4a69-a821-8f6ce5b33506/volumes" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.232133 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-mtsm2" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.365846 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.372379 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.425864 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e829eb3c-14a4-40d6-904e-483dbe3cb066-logs\") pod \"e829eb3c-14a4-40d6-904e-483dbe3cb066\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.425911 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z94fz\" (UniqueName: \"kubernetes.io/projected/e829eb3c-14a4-40d6-904e-483dbe3cb066-kube-api-access-z94fz\") pod \"e829eb3c-14a4-40d6-904e-483dbe3cb066\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.425969 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e829eb3c-14a4-40d6-904e-483dbe3cb066-horizon-secret-key\") pod \"e829eb3c-14a4-40d6-904e-483dbe3cb066\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.426054 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e829eb3c-14a4-40d6-904e-483dbe3cb066-config-data\") pod \"e829eb3c-14a4-40d6-904e-483dbe3cb066\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.426136 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-dns-svc\") pod \"1bcfc888-3413-4eb9-a887-ef250a15962a\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.426174 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78qs5\" 
(UniqueName: \"kubernetes.io/projected/1bcfc888-3413-4eb9-a887-ef250a15962a-kube-api-access-78qs5\") pod \"1bcfc888-3413-4eb9-a887-ef250a15962a\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.426235 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e829eb3c-14a4-40d6-904e-483dbe3cb066-scripts\") pod \"e829eb3c-14a4-40d6-904e-483dbe3cb066\" (UID: \"e829eb3c-14a4-40d6-904e-483dbe3cb066\") " Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.426256 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-ovsdbserver-nb\") pod \"1bcfc888-3413-4eb9-a887-ef250a15962a\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.426274 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-config\") pod \"1bcfc888-3413-4eb9-a887-ef250a15962a\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.426299 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-ovsdbserver-sb\") pod \"1bcfc888-3413-4eb9-a887-ef250a15962a\" (UID: \"1bcfc888-3413-4eb9-a887-ef250a15962a\") " Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.429080 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e829eb3c-14a4-40d6-904e-483dbe3cb066-scripts" (OuterVolumeSpecName: "scripts") pod "e829eb3c-14a4-40d6-904e-483dbe3cb066" (UID: "e829eb3c-14a4-40d6-904e-483dbe3cb066"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.440572 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e829eb3c-14a4-40d6-904e-483dbe3cb066-config-data" (OuterVolumeSpecName: "config-data") pod "e829eb3c-14a4-40d6-904e-483dbe3cb066" (UID: "e829eb3c-14a4-40d6-904e-483dbe3cb066"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.441476 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e829eb3c-14a4-40d6-904e-483dbe3cb066-logs" (OuterVolumeSpecName: "logs") pod "e829eb3c-14a4-40d6-904e-483dbe3cb066" (UID: "e829eb3c-14a4-40d6-904e-483dbe3cb066"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.441641 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bcfc888-3413-4eb9-a887-ef250a15962a-kube-api-access-78qs5" (OuterVolumeSpecName: "kube-api-access-78qs5") pod "1bcfc888-3413-4eb9-a887-ef250a15962a" (UID: "1bcfc888-3413-4eb9-a887-ef250a15962a"). InnerVolumeSpecName "kube-api-access-78qs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.443398 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e829eb3c-14a4-40d6-904e-483dbe3cb066-kube-api-access-z94fz" (OuterVolumeSpecName: "kube-api-access-z94fz") pod "e829eb3c-14a4-40d6-904e-483dbe3cb066" (UID: "e829eb3c-14a4-40d6-904e-483dbe3cb066"). InnerVolumeSpecName "kube-api-access-z94fz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.503327 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e829eb3c-14a4-40d6-904e-483dbe3cb066-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e829eb3c-14a4-40d6-904e-483dbe3cb066" (UID: "e829eb3c-14a4-40d6-904e-483dbe3cb066"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.527905 4580 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e829eb3c-14a4-40d6-904e-483dbe3cb066-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.527941 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e829eb3c-14a4-40d6-904e-483dbe3cb066-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.527951 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78qs5\" (UniqueName: \"kubernetes.io/projected/1bcfc888-3413-4eb9-a887-ef250a15962a-kube-api-access-78qs5\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.527962 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e829eb3c-14a4-40d6-904e-483dbe3cb066-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.527971 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e829eb3c-14a4-40d6-904e-483dbe3cb066-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.527980 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z94fz\" (UniqueName: 
\"kubernetes.io/projected/e829eb3c-14a4-40d6-904e-483dbe3cb066-kube-api-access-z94fz\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.540709 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-config" (OuterVolumeSpecName: "config") pod "1bcfc888-3413-4eb9-a887-ef250a15962a" (UID: "1bcfc888-3413-4eb9-a887-ef250a15962a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.555983 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1bcfc888-3413-4eb9-a887-ef250a15962a" (UID: "1bcfc888-3413-4eb9-a887-ef250a15962a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.574107 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1bcfc888-3413-4eb9-a887-ef250a15962a" (UID: "1bcfc888-3413-4eb9-a887-ef250a15962a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.576072 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1bcfc888-3413-4eb9-a887-ef250a15962a" (UID: "1bcfc888-3413-4eb9-a887-ef250a15962a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.635141 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.635414 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.635650 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:03 crc kubenswrapper[4580]: I0321 05:13:03.635667 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bcfc888-3413-4eb9-a887-ef250a15962a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:04 crc kubenswrapper[4580]: I0321 05:13:04.240559 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b5d99966f-s6s2j" event={"ID":"e829eb3c-14a4-40d6-904e-483dbe3cb066","Type":"ContainerDied","Data":"d79734d0e603cbaec138d59e662806c1033fd5a7b9db3aa5dc51f055061b5bbc"} Mar 21 05:13:04 crc kubenswrapper[4580]: I0321 05:13:04.240907 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b5d99966f-s6s2j" Mar 21 05:13:04 crc kubenswrapper[4580]: I0321 05:13:04.247261 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mtsm2" event={"ID":"1bcfc888-3413-4eb9-a887-ef250a15962a","Type":"ContainerDied","Data":"6ef9b81fd82b8fb915899af4c63d13838d3d530eb4270aeb7294cdb86622d777"} Mar 21 05:13:04 crc kubenswrapper[4580]: I0321 05:13:04.247380 4580 scope.go:117] "RemoveContainer" containerID="d34a243adbe1462703a007a8dac8774f1d988e5dc334028b7c8492fa49467735" Mar 21 05:13:04 crc kubenswrapper[4580]: I0321 05:13:04.247471 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mtsm2" Mar 21 05:13:04 crc kubenswrapper[4580]: I0321 05:13:04.317216 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b5d99966f-s6s2j"] Mar 21 05:13:04 crc kubenswrapper[4580]: I0321 05:13:04.328683 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b5d99966f-s6s2j"] Mar 21 05:13:04 crc kubenswrapper[4580]: I0321 05:13:04.337692 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mtsm2"] Mar 21 05:13:04 crc kubenswrapper[4580]: I0321 05:13:04.346536 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mtsm2"] Mar 21 05:13:05 crc kubenswrapper[4580]: I0321 05:13:05.626242 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" path="/var/lib/kubelet/pods/1bcfc888-3413-4eb9-a887-ef250a15962a/volumes" Mar 21 05:13:05 crc kubenswrapper[4580]: I0321 05:13:05.627325 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e829eb3c-14a4-40d6-904e-483dbe3cb066" path="/var/lib/kubelet/pods/e829eb3c-14a4-40d6-904e-483dbe3cb066/volumes" Mar 21 05:13:05 crc kubenswrapper[4580]: I0321 05:13:05.909377 4580 scope.go:117] "RemoveContainer" 
containerID="21567b14a9c45003f7fb9cd7948fd6e686a89c6c8ad543f92d66fa175f5d0053" Mar 21 05:13:06 crc kubenswrapper[4580]: E0321 05:13:06.123757 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 21 05:13:06 crc kubenswrapper[4580]: E0321 05:13:06.124213 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.p
em,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j6q57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-m7rwg_openstack(55568564-a701-4c16-b5c4-617f88c364a5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:13:06 crc kubenswrapper[4580]: E0321 05:13:06.125297 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-m7rwg" podUID="55568564-a701-4c16-b5c4-617f88c364a5" Mar 21 05:13:06 crc kubenswrapper[4580]: E0321 05:13:06.309269 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-m7rwg" podUID="55568564-a701-4c16-b5c4-617f88c364a5" Mar 21 05:13:06 crc kubenswrapper[4580]: I0321 05:13:06.335274 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67655f8b6-mbx6n"] Mar 21 05:13:06 crc kubenswrapper[4580]: I0321 
05:13:06.443062 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-587cfc8688-265kc"] Mar 21 05:13:06 crc kubenswrapper[4580]: W0321 05:13:06.636593 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda03ce0fa_f7e8_4b48_bbea_95807f14dd26.slice/crio-a86b34256c1b921bc933f130b957b076bb0092bb6db63eb6a077a85eadcb4a29 WatchSource:0}: Error finding container a86b34256c1b921bc933f130b957b076bb0092bb6db63eb6a077a85eadcb4a29: Status 404 returned error can't find the container with id a86b34256c1b921bc933f130b957b076bb0092bb6db63eb6a077a85eadcb4a29 Mar 21 05:13:06 crc kubenswrapper[4580]: W0321 05:13:06.639112 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08a0110f_428a_481d_b439_bc16e6837dc3.slice/crio-3bf837f5f84a7e9166c2c26039f4d5a0f408f7ddf6a3b2f18ae00b5fe9399791 WatchSource:0}: Error finding container 3bf837f5f84a7e9166c2c26039f4d5a0f408f7ddf6a3b2f18ae00b5fe9399791: Status 404 returned error can't find the container with id 3bf837f5f84a7e9166c2c26039f4d5a0f408f7ddf6a3b2f18ae00b5fe9399791 Mar 21 05:13:06 crc kubenswrapper[4580]: I0321 05:13:06.711394 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2tfwn"] Mar 21 05:13:06 crc kubenswrapper[4580]: W0321 05:13:06.734830 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3391bccc_2f7a_469a_8166_5ce7169e9917.slice/crio-638f73a12f2db71b0519d17410a25508c1263ee87f0ccdb962cda43ff662b532 WatchSource:0}: Error finding container 638f73a12f2db71b0519d17410a25508c1263ee87f0ccdb962cda43ff662b532: Status 404 returned error can't find the container with id 638f73a12f2db71b0519d17410a25508c1263ee87f0ccdb962cda43ff662b532 Mar 21 05:13:07 crc kubenswrapper[4580]: I0321 05:13:07.317269 4580 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/horizon-67655f8b6-mbx6n" event={"ID":"a03ce0fa-f7e8-4b48-bbea-95807f14dd26","Type":"ContainerStarted","Data":"a86b34256c1b921bc933f130b957b076bb0092bb6db63eb6a077a85eadcb4a29"} Mar 21 05:13:07 crc kubenswrapper[4580]: I0321 05:13:07.319620 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cgz9l" event={"ID":"2234c053-0318-4d03-8e9f-b9ee569529fc","Type":"ContainerStarted","Data":"8f80ca1e7bd3a43740a103e7a0d8280fd86914f170cee6de45d81273597cb752"} Mar 21 05:13:07 crc kubenswrapper[4580]: I0321 05:13:07.322897 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587cfc8688-265kc" event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerStarted","Data":"3bf837f5f84a7e9166c2c26039f4d5a0f408f7ddf6a3b2f18ae00b5fe9399791"} Mar 21 05:13:07 crc kubenswrapper[4580]: I0321 05:13:07.328027 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c311e091-7cf1-426b-9788-a3d64b198e43","Type":"ContainerStarted","Data":"2ee092012d167f5b248abd488e907d058d7e1ad76a3a5daf34599a61962fa2b6"} Mar 21 05:13:07 crc kubenswrapper[4580]: I0321 05:13:07.332928 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2tfwn" event={"ID":"3391bccc-2f7a-469a-8166-5ce7169e9917","Type":"ContainerStarted","Data":"217ccbfd9b38375bf06fdf2171c10e2a95a0662f2f6a33b874445488a76b2f60"} Mar 21 05:13:07 crc kubenswrapper[4580]: I0321 05:13:07.332974 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2tfwn" event={"ID":"3391bccc-2f7a-469a-8166-5ce7169e9917","Type":"ContainerStarted","Data":"638f73a12f2db71b0519d17410a25508c1263ee87f0ccdb962cda43ff662b532"} Mar 21 05:13:07 crc kubenswrapper[4580]: I0321 05:13:07.361598 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-cgz9l" podStartSLOduration=3.149661843 podStartE2EDuration="46.361569045s" 
podCreationTimestamp="2026-03-21 05:12:21 +0000 UTC" firstStartedPulling="2026-03-21 05:12:23.457856448 +0000 UTC m=+1248.540440076" lastFinishedPulling="2026-03-21 05:13:06.66976365 +0000 UTC m=+1291.752347278" observedRunningTime="2026-03-21 05:13:07.338149782 +0000 UTC m=+1292.420733410" watchObservedRunningTime="2026-03-21 05:13:07.361569045 +0000 UTC m=+1292.444152683" Mar 21 05:13:07 crc kubenswrapper[4580]: I0321 05:13:07.383711 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2tfwn" podStartSLOduration=11.383682363 podStartE2EDuration="11.383682363s" podCreationTimestamp="2026-03-21 05:12:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:07.357317642 +0000 UTC m=+1292.439901280" watchObservedRunningTime="2026-03-21 05:13:07.383682363 +0000 UTC m=+1292.466265991" Mar 21 05:13:08 crc kubenswrapper[4580]: I0321 05:13:08.233647 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-mtsm2" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Mar 21 05:13:08 crc kubenswrapper[4580]: I0321 05:13:08.347804 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67655f8b6-mbx6n" event={"ID":"a03ce0fa-f7e8-4b48-bbea-95807f14dd26","Type":"ContainerStarted","Data":"b03c7b6b3d34260bff0a00bc798a52da6836ea0ee76b7c6df6980b8c29af49eb"} Mar 21 05:13:08 crc kubenswrapper[4580]: I0321 05:13:08.347853 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67655f8b6-mbx6n" event={"ID":"a03ce0fa-f7e8-4b48-bbea-95807f14dd26","Type":"ContainerStarted","Data":"8f60185607aa66ea61c1f323959ec37f2c1ea379396e9102352b21fda7b47178"} Mar 21 05:13:08 crc kubenswrapper[4580]: I0321 05:13:08.352491 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-587cfc8688-265kc" event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerStarted","Data":"19fb47284615f4db4d3ee3b8a1bb2963d50724cdbe63d92f0b19442506b6bf5b"} Mar 21 05:13:08 crc kubenswrapper[4580]: I0321 05:13:08.352536 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587cfc8688-265kc" event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerStarted","Data":"7573b50ebc5ac682fcca653fb89d61d20bbd5e002d97c910776fa487a5d85059"} Mar 21 05:13:08 crc kubenswrapper[4580]: I0321 05:13:08.378730 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67655f8b6-mbx6n" podStartSLOduration=36.879559628 podStartE2EDuration="37.37870434s" podCreationTimestamp="2026-03-21 05:12:31 +0000 UTC" firstStartedPulling="2026-03-21 05:13:06.648240618 +0000 UTC m=+1291.730824246" lastFinishedPulling="2026-03-21 05:13:07.14738533 +0000 UTC m=+1292.229968958" observedRunningTime="2026-03-21 05:13:08.368724375 +0000 UTC m=+1293.451308013" watchObservedRunningTime="2026-03-21 05:13:08.37870434 +0000 UTC m=+1293.461287968" Mar 21 05:13:08 crc kubenswrapper[4580]: I0321 05:13:08.409966 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-587cfc8688-265kc" podStartSLOduration=36.912078293 podStartE2EDuration="37.409948211s" podCreationTimestamp="2026-03-21 05:12:31 +0000 UTC" firstStartedPulling="2026-03-21 05:13:06.64831094 +0000 UTC m=+1291.730894568" lastFinishedPulling="2026-03-21 05:13:07.146180858 +0000 UTC m=+1292.228764486" observedRunningTime="2026-03-21 05:13:08.399206045 +0000 UTC m=+1293.481789683" watchObservedRunningTime="2026-03-21 05:13:08.409948211 +0000 UTC m=+1293.492531839" Mar 21 05:13:09 crc kubenswrapper[4580]: I0321 05:13:09.361920 4580 generic.go:334] "Generic (PLEG): container finished" podID="f86c26eb-bb22-460c-8a45-191a02924112" containerID="3ff385e1bbcee324592b75b1425bea21d93e458294c849c13f312dc58af9311a" 
exitCode=0 Mar 21 05:13:09 crc kubenswrapper[4580]: I0321 05:13:09.362064 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n4vzq" event={"ID":"f86c26eb-bb22-460c-8a45-191a02924112","Type":"ContainerDied","Data":"3ff385e1bbcee324592b75b1425bea21d93e458294c849c13f312dc58af9311a"} Mar 21 05:13:10 crc kubenswrapper[4580]: I0321 05:13:10.872913 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-n4vzq" Mar 21 05:13:10 crc kubenswrapper[4580]: I0321 05:13:10.984540 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-combined-ca-bundle\") pod \"f86c26eb-bb22-460c-8a45-191a02924112\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " Mar 21 05:13:10 crc kubenswrapper[4580]: I0321 05:13:10.984620 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-db-sync-config-data\") pod \"f86c26eb-bb22-460c-8a45-191a02924112\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " Mar 21 05:13:10 crc kubenswrapper[4580]: I0321 05:13:10.984649 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-config-data\") pod \"f86c26eb-bb22-460c-8a45-191a02924112\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " Mar 21 05:13:10 crc kubenswrapper[4580]: I0321 05:13:10.984695 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h95zr\" (UniqueName: \"kubernetes.io/projected/f86c26eb-bb22-460c-8a45-191a02924112-kube-api-access-h95zr\") pod \"f86c26eb-bb22-460c-8a45-191a02924112\" (UID: \"f86c26eb-bb22-460c-8a45-191a02924112\") " Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.004957 4580 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f86c26eb-bb22-460c-8a45-191a02924112" (UID: "f86c26eb-bb22-460c-8a45-191a02924112"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.006696 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f86c26eb-bb22-460c-8a45-191a02924112-kube-api-access-h95zr" (OuterVolumeSpecName: "kube-api-access-h95zr") pod "f86c26eb-bb22-460c-8a45-191a02924112" (UID: "f86c26eb-bb22-460c-8a45-191a02924112"). InnerVolumeSpecName "kube-api-access-h95zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.032873 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f86c26eb-bb22-460c-8a45-191a02924112" (UID: "f86c26eb-bb22-460c-8a45-191a02924112"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.051193 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-config-data" (OuterVolumeSpecName: "config-data") pod "f86c26eb-bb22-460c-8a45-191a02924112" (UID: "f86c26eb-bb22-460c-8a45-191a02924112"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.086667 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.086919 4580 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.087035 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86c26eb-bb22-460c-8a45-191a02924112-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.087121 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h95zr\" (UniqueName: \"kubernetes.io/projected/f86c26eb-bb22-460c-8a45-191a02924112-kube-api-access-h95zr\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.382612 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n4vzq" event={"ID":"f86c26eb-bb22-460c-8a45-191a02924112","Type":"ContainerDied","Data":"c811f7c0d3e27ca679edddcfe96806d12a390e73a0bf11a30d4778cf3907d7f2"} Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.383074 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c811f7c0d3e27ca679edddcfe96806d12a390e73a0bf11a30d4778cf3907d7f2" Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.383180 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n4vzq" Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.401121 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.409947 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.506773 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:13:11 crc kubenswrapper[4580]: I0321 05:13:11.507518 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.326847 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gfzjq"] Mar 21 05:13:12 crc kubenswrapper[4580]: E0321 05:13:12.327250 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerName="init" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.327263 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerName="init" Mar 21 05:13:12 crc kubenswrapper[4580]: E0321 05:13:12.327283 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerName="dnsmasq-dns" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.327289 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerName="dnsmasq-dns" Mar 21 05:13:12 crc kubenswrapper[4580]: E0321 05:13:12.327314 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86c26eb-bb22-460c-8a45-191a02924112" containerName="glance-db-sync" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.327320 4580 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f86c26eb-bb22-460c-8a45-191a02924112" containerName="glance-db-sync" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.327470 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f86c26eb-bb22-460c-8a45-191a02924112" containerName="glance-db-sync" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.327482 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bcfc888-3413-4eb9-a887-ef250a15962a" containerName="dnsmasq-dns" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.328340 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.349105 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gfzjq"] Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.427651 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.427742 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87sj9\" (UniqueName: \"kubernetes.io/projected/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-kube-api-access-87sj9\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.427837 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-config\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") 
" pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.427900 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.427994 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.428025 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.529223 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-config\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.529280 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: 
\"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.529330 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.529354 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.529373 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.529412 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87sj9\" (UniqueName: \"kubernetes.io/projected/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-kube-api-access-87sj9\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.531298 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-config\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.531809 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.532027 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.532605 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.533276 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc kubenswrapper[4580]: I0321 05:13:12.560729 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87sj9\" (UniqueName: \"kubernetes.io/projected/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-kube-api-access-87sj9\") pod \"dnsmasq-dns-785d8bcb8c-gfzjq\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:12 crc 
kubenswrapper[4580]: I0321 05:13:12.677345 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.152873 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.154866 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.159574 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-65qpr" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.159929 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.159947 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.209020 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.243122 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.243409 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " 
pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.243493 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.243545 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j79dq\" (UniqueName: \"kubernetes.io/projected/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-kube-api-access-j79dq\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.243707 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-logs\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.243851 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.244113 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " 
pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.345662 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.345713 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.345740 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j79dq\" (UniqueName: \"kubernetes.io/projected/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-kube-api-access-j79dq\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.345804 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-logs\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.345852 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: 
I0321 05:13:13.345902 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.345945 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.346327 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.346572 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.347375 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-logs\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.353572 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.355767 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.359603 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.372992 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j79dq\" (UniqueName: \"kubernetes.io/projected/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-kube-api-access-j79dq\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.392272 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.479319 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.525012 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.526863 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.531700 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.536039 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.548702 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-config-data\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.548803 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-scripts\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.548901 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 
05:13:13.549048 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94084a3b-f944-4e21-a003-a9fee80c7248-logs\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.549092 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnqjw\" (UniqueName: \"kubernetes.io/projected/94084a3b-f944-4e21-a003-a9fee80c7248-kube-api-access-lnqjw\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.549194 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94084a3b-f944-4e21-a003-a9fee80c7248-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.549287 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.651122 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc 
kubenswrapper[4580]: I0321 05:13:13.651187 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-config-data\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.651271 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-scripts\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.651301 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.651355 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94084a3b-f944-4e21-a003-a9fee80c7248-logs\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.651375 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnqjw\" (UniqueName: \"kubernetes.io/projected/94084a3b-f944-4e21-a003-a9fee80c7248-kube-api-access-lnqjw\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.651438 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94084a3b-f944-4e21-a003-a9fee80c7248-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.651956 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94084a3b-f944-4e21-a003-a9fee80c7248-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.652971 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.665932 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94084a3b-f944-4e21-a003-a9fee80c7248-logs\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.669358 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-config-data\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.669936 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.677719 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnqjw\" (UniqueName: \"kubernetes.io/projected/94084a3b-f944-4e21-a003-a9fee80c7248-kube-api-access-lnqjw\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.678495 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-scripts\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.691882 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:13:13 crc kubenswrapper[4580]: I0321 05:13:13.883307 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 05:13:15 crc kubenswrapper[4580]: I0321 05:13:15.437413 4580 generic.go:334] "Generic (PLEG): container finished" podID="2234c053-0318-4d03-8e9f-b9ee569529fc" containerID="8f80ca1e7bd3a43740a103e7a0d8280fd86914f170cee6de45d81273597cb752" exitCode=0 Mar 21 05:13:15 crc kubenswrapper[4580]: I0321 05:13:15.437492 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cgz9l" event={"ID":"2234c053-0318-4d03-8e9f-b9ee569529fc","Type":"ContainerDied","Data":"8f80ca1e7bd3a43740a103e7a0d8280fd86914f170cee6de45d81273597cb752"} Mar 21 05:13:15 crc kubenswrapper[4580]: I0321 05:13:15.668243 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:13:15 crc kubenswrapper[4580]: I0321 05:13:15.764591 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.710295 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-cgz9l" Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.830277 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-scripts\") pod \"2234c053-0318-4d03-8e9f-b9ee569529fc\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.830614 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-combined-ca-bundle\") pod \"2234c053-0318-4d03-8e9f-b9ee569529fc\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.830684 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-config-data\") pod \"2234c053-0318-4d03-8e9f-b9ee569529fc\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.830743 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d6h4\" (UniqueName: \"kubernetes.io/projected/2234c053-0318-4d03-8e9f-b9ee569529fc-kube-api-access-5d6h4\") pod \"2234c053-0318-4d03-8e9f-b9ee569529fc\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.830765 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2234c053-0318-4d03-8e9f-b9ee569529fc-logs\") pod \"2234c053-0318-4d03-8e9f-b9ee569529fc\" (UID: \"2234c053-0318-4d03-8e9f-b9ee569529fc\") " Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.831441 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2234c053-0318-4d03-8e9f-b9ee569529fc-logs" (OuterVolumeSpecName: "logs") pod "2234c053-0318-4d03-8e9f-b9ee569529fc" (UID: "2234c053-0318-4d03-8e9f-b9ee569529fc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.843896 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-scripts" (OuterVolumeSpecName: "scripts") pod "2234c053-0318-4d03-8e9f-b9ee569529fc" (UID: "2234c053-0318-4d03-8e9f-b9ee569529fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.844440 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2234c053-0318-4d03-8e9f-b9ee569529fc-kube-api-access-5d6h4" (OuterVolumeSpecName: "kube-api-access-5d6h4") pod "2234c053-0318-4d03-8e9f-b9ee569529fc" (UID: "2234c053-0318-4d03-8e9f-b9ee569529fc"). InnerVolumeSpecName "kube-api-access-5d6h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.887876 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2234c053-0318-4d03-8e9f-b9ee569529fc" (UID: "2234c053-0318-4d03-8e9f-b9ee569529fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.890872 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-config-data" (OuterVolumeSpecName: "config-data") pod "2234c053-0318-4d03-8e9f-b9ee569529fc" (UID: "2234c053-0318-4d03-8e9f-b9ee569529fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.933757 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.933803 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.933814 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2234c053-0318-4d03-8e9f-b9ee569529fc-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.933825 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d6h4\" (UniqueName: \"kubernetes.io/projected/2234c053-0318-4d03-8e9f-b9ee569529fc-kube-api-access-5d6h4\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:17 crc kubenswrapper[4580]: I0321 05:13:17.933835 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2234c053-0318-4d03-8e9f-b9ee569529fc-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.218800 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.258346 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gfzjq"] Mar 21 05:13:18 crc kubenswrapper[4580]: W0321 05:13:18.275574 4580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod164fdcfb_ffa0_4152_b9e8_d3f29c16090c.slice/crio-c329d6206f4e3e108b2f1bcccb07b5694138c8084bbd91a2b0ac850bab3f72c0 WatchSource:0}: Error finding container c329d6206f4e3e108b2f1bcccb07b5694138c8084bbd91a2b0ac850bab3f72c0: Status 404 returned error can't find the container with id c329d6206f4e3e108b2f1bcccb07b5694138c8084bbd91a2b0ac850bab3f72c0
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.341049 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.480032 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"94084a3b-f944-4e21-a003-a9fee80c7248","Type":"ContainerStarted","Data":"9ebfac552a9bd47a710304fadd2dece44d1f72c20bef12a277b8c9a9a0707a43"}
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.485342 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" event={"ID":"164fdcfb-ffa0-4152-b9e8-d3f29c16090c","Type":"ContainerStarted","Data":"c329d6206f4e3e108b2f1bcccb07b5694138c8084bbd91a2b0ac850bab3f72c0"}
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.487622 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cgz9l" event={"ID":"2234c053-0318-4d03-8e9f-b9ee569529fc","Type":"ContainerDied","Data":"71e1c43ecf211a081a5346423cfe9b1d5914d19409029eb95667bb92169b8fc5"}
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.487644 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71e1c43ecf211a081a5346423cfe9b1d5914d19409029eb95667bb92169b8fc5"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.487694 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cgz9l"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.491583 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5","Type":"ContainerStarted","Data":"644a1e593153611cb7519e9243f9a0c78ee803e72d27da94333c6de8f52d4b46"}
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.837543 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-55886d54c6-b2qbq"]
Mar 21 05:13:18 crc kubenswrapper[4580]: E0321 05:13:18.837889 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2234c053-0318-4d03-8e9f-b9ee569529fc" containerName="placement-db-sync"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.837901 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2234c053-0318-4d03-8e9f-b9ee569529fc" containerName="placement-db-sync"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.838060 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2234c053-0318-4d03-8e9f-b9ee569529fc" containerName="placement-db-sync"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.839367 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.852062 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.852549 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.852674 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.853120 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.853739 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6mvvp"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.869345 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55886d54c6-b2qbq"]
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.967331 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-combined-ca-bundle\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.967620 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db2cc\" (UniqueName: \"kubernetes.io/projected/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-kube-api-access-db2cc\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.967694 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-internal-tls-certs\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.967714 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-logs\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.967738 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-scripts\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.967769 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-public-tls-certs\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:18 crc kubenswrapper[4580]: I0321 05:13:18.967831 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-config-data\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.069516 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-scripts\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.069754 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-public-tls-certs\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.069909 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-config-data\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.070028 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-combined-ca-bundle\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.070102 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db2cc\" (UniqueName: \"kubernetes.io/projected/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-kube-api-access-db2cc\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.070215 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-internal-tls-certs\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.070286 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-logs\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.070668 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-logs\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.077003 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-public-tls-certs\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.078209 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-scripts\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.085631 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-config-data\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.086269 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-combined-ca-bundle\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.090369 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db2cc\" (UniqueName: \"kubernetes.io/projected/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-kube-api-access-db2cc\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.101502 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-internal-tls-certs\") pod \"placement-55886d54c6-b2qbq\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.185449 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55886d54c6-b2qbq"
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.577675 4580 generic.go:334] "Generic (PLEG): container finished" podID="3391bccc-2f7a-469a-8166-5ce7169e9917" containerID="217ccbfd9b38375bf06fdf2171c10e2a95a0662f2f6a33b874445488a76b2f60" exitCode=0
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.577715 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2tfwn" event={"ID":"3391bccc-2f7a-469a-8166-5ce7169e9917","Type":"ContainerDied","Data":"217ccbfd9b38375bf06fdf2171c10e2a95a0662f2f6a33b874445488a76b2f60"}
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.585370 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5","Type":"ContainerStarted","Data":"befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634"}
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.680535 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"94084a3b-f944-4e21-a003-a9fee80c7248","Type":"ContainerStarted","Data":"377df9769c8e88363355633f472d9098242ec6c99c843f0d3e4be346e403468e"}
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.694647 4580 generic.go:334] "Generic (PLEG): container finished" podID="164fdcfb-ffa0-4152-b9e8-d3f29c16090c" containerID="928f56c5c23f263103d35fb9a339d137c928c02e81b63ac85da13955b63e8fe3" exitCode=0
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.694708 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" event={"ID":"164fdcfb-ffa0-4152-b9e8-d3f29c16090c","Type":"ContainerDied","Data":"928f56c5c23f263103d35fb9a339d137c928c02e81b63ac85da13955b63e8fe3"}
Mar 21 05:13:19 crc kubenswrapper[4580]: I0321 05:13:19.957202 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55886d54c6-b2qbq"]
Mar 21 05:13:20 crc kubenswrapper[4580]: W0321 05:13:20.035245 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb91b8cf1_cf3f_4789_b232_03aeea8f47ef.slice/crio-fe6857fa07e803052bfcb06c7c1f015220a0fbb02e6d02a75107838425b2f8a1 WatchSource:0}: Error finding container fe6857fa07e803052bfcb06c7c1f015220a0fbb02e6d02a75107838425b2f8a1: Status 404 returned error can't find the container with id fe6857fa07e803052bfcb06c7c1f015220a0fbb02e6d02a75107838425b2f8a1
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.746242 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f2ksl" event={"ID":"3bfbab08-aee7-43bf-9118-252682438c95","Type":"ContainerStarted","Data":"cf1fbd9cf540e60eee221e084401254071b0f33e589ff7e0ba9e0bd010cc35ad"}
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.749712 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"94084a3b-f944-4e21-a003-a9fee80c7248","Type":"ContainerStarted","Data":"4a0ed2e5f8b3f5e4c5a389cd8ec9df93da3e9997523273906c66c8856504acff"}
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.749762 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="94084a3b-f944-4e21-a003-a9fee80c7248" containerName="glance-log" containerID="cri-o://377df9769c8e88363355633f472d9098242ec6c99c843f0d3e4be346e403468e" gracePeriod=30
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.749834 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="94084a3b-f944-4e21-a003-a9fee80c7248" containerName="glance-httpd" containerID="cri-o://4a0ed2e5f8b3f5e4c5a389cd8ec9df93da3e9997523273906c66c8856504acff" gracePeriod=30
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.752694 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" event={"ID":"164fdcfb-ffa0-4152-b9e8-d3f29c16090c","Type":"ContainerStarted","Data":"a4d3a88bdacd15870c94059cd5a3ac2c28c44040866ad4019864559db267a928"}
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.752961 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq"
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.761962 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c311e091-7cf1-426b-9788-a3d64b198e43","Type":"ContainerStarted","Data":"79b346aa629f5e0c52009561345e629f878a979fb3866889dc45c60a0a077a68"}
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.769618 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55886d54c6-b2qbq" event={"ID":"b91b8cf1-cf3f-4789-b232-03aeea8f47ef","Type":"ContainerStarted","Data":"e8eb77b909e1c1f8309a03f5e03250e60939007e839e0a195f74647abdf86540"}
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.769659 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55886d54c6-b2qbq" event={"ID":"b91b8cf1-cf3f-4789-b232-03aeea8f47ef","Type":"ContainerStarted","Data":"fe6857fa07e803052bfcb06c7c1f015220a0fbb02e6d02a75107838425b2f8a1"}
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.776505 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" containerName="glance-log" containerID="cri-o://befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634" gracePeriod=30
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.777595 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5","Type":"ContainerStarted","Data":"26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4"}
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.777656 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" containerName="glance-httpd" containerID="cri-o://26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4" gracePeriod=30
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.809860 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" podStartSLOduration=8.809727144 podStartE2EDuration="8.809727144s" podCreationTimestamp="2026-03-21 05:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:20.799460031 +0000 UTC m=+1305.882043679" watchObservedRunningTime="2026-03-21 05:13:20.809727144 +0000 UTC m=+1305.892310772"
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.813002 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-f2ksl" podStartSLOduration=4.241765901 podStartE2EDuration="59.812980141s" podCreationTimestamp="2026-03-21 05:12:21 +0000 UTC" firstStartedPulling="2026-03-21 05:12:23.572579059 +0000 UTC m=+1248.655162687" lastFinishedPulling="2026-03-21 05:13:19.143793309 +0000 UTC m=+1304.226376927" observedRunningTime="2026-03-21 05:13:20.768415796 +0000 UTC m=+1305.850999434" watchObservedRunningTime="2026-03-21 05:13:20.812980141 +0000 UTC m=+1305.895563769"
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.856215 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.85619375 podStartE2EDuration="8.85619375s" podCreationTimestamp="2026-03-21 05:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:20.845243229 +0000 UTC m=+1305.927826867" watchObservedRunningTime="2026-03-21 05:13:20.85619375 +0000 UTC m=+1305.938777398"
Mar 21 05:13:20 crc kubenswrapper[4580]: I0321 05:13:20.887277 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.887257636 podStartE2EDuration="8.887257636s" podCreationTimestamp="2026-03-21 05:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:20.878873123 +0000 UTC m=+1305.961456771" watchObservedRunningTime="2026-03-21 05:13:20.887257636 +0000 UTC m=+1305.969841264"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.305917 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2tfwn"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.398558 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.429730 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-combined-ca-bundle\") pod \"3391bccc-2f7a-469a-8166-5ce7169e9917\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") "
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.429804 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-scripts\") pod \"3391bccc-2f7a-469a-8166-5ce7169e9917\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") "
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.429831 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-credential-keys\") pod \"3391bccc-2f7a-469a-8166-5ce7169e9917\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") "
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.429908 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-fernet-keys\") pod \"3391bccc-2f7a-469a-8166-5ce7169e9917\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") "
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.430036 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-config-data\") pod \"3391bccc-2f7a-469a-8166-5ce7169e9917\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") "
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.430073 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7ldk\" (UniqueName: \"kubernetes.io/projected/3391bccc-2f7a-469a-8166-5ce7169e9917-kube-api-access-b7ldk\") pod \"3391bccc-2f7a-469a-8166-5ce7169e9917\" (UID: \"3391bccc-2f7a-469a-8166-5ce7169e9917\") "
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.438321 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-scripts" (OuterVolumeSpecName: "scripts") pod "3391bccc-2f7a-469a-8166-5ce7169e9917" (UID: "3391bccc-2f7a-469a-8166-5ce7169e9917"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.439095 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3391bccc-2f7a-469a-8166-5ce7169e9917" (UID: "3391bccc-2f7a-469a-8166-5ce7169e9917"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.441013 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3391bccc-2f7a-469a-8166-5ce7169e9917" (UID: "3391bccc-2f7a-469a-8166-5ce7169e9917"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.443979 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3391bccc-2f7a-469a-8166-5ce7169e9917-kube-api-access-b7ldk" (OuterVolumeSpecName: "kube-api-access-b7ldk") pod "3391bccc-2f7a-469a-8166-5ce7169e9917" (UID: "3391bccc-2f7a-469a-8166-5ce7169e9917"). InnerVolumeSpecName "kube-api-access-b7ldk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.467957 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3391bccc-2f7a-469a-8166-5ce7169e9917" (UID: "3391bccc-2f7a-469a-8166-5ce7169e9917"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.511270 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.532337 4580 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.532368 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7ldk\" (UniqueName: \"kubernetes.io/projected/3391bccc-2f7a-469a-8166-5ce7169e9917-kube-api-access-b7ldk\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.532379 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.532391 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.532400 4580 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.534700 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-config-data" (OuterVolumeSpecName: "config-data") pod "3391bccc-2f7a-469a-8166-5ce7169e9917" (UID: "3391bccc-2f7a-469a-8166-5ce7169e9917"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.601201 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.641097 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3391bccc-2f7a-469a-8166-5ce7169e9917-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.714023 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6d45658b5d-dfjj4"]
Mar 21 05:13:21 crc kubenswrapper[4580]: E0321 05:13:21.714412 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" containerName="glance-log"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.714424 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" containerName="glance-log"
Mar 21 05:13:21 crc kubenswrapper[4580]: E0321 05:13:21.714441 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3391bccc-2f7a-469a-8166-5ce7169e9917" containerName="keystone-bootstrap"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.714447 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3391bccc-2f7a-469a-8166-5ce7169e9917" containerName="keystone-bootstrap"
Mar 21 05:13:21 crc kubenswrapper[4580]: E0321 05:13:21.714454 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" containerName="glance-httpd"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.714460 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" containerName="glance-httpd"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.714631 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" containerName="glance-httpd"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.714653 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="3391bccc-2f7a-469a-8166-5ce7169e9917" containerName="keystone-bootstrap"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.714666 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" containerName="glance-log"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.715216 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6d45658b5d-dfjj4"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.730663 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.730906 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.742097 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-logs\") pod \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") "
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.742215 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-config-data\") pod \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") "
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.742241 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-httpd-run\") pod \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") "
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.742268 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") "
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.742328 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j79dq\" (UniqueName: \"kubernetes.io/projected/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-kube-api-access-j79dq\") pod \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") "
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.742391 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-combined-ca-bundle\") pod \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") "
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.742412 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-scripts\") pod \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\" (UID: \"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5\") "
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.743199 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" (UID: "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.743348 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-logs" (OuterVolumeSpecName: "logs") pod "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" (UID: "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.752243 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" (UID: "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.753921 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-scripts" (OuterVolumeSpecName: "scripts") pod "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" (UID: "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.756137 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d45658b5d-dfjj4"]
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.809584 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-kube-api-access-j79dq" (OuterVolumeSpecName: "kube-api-access-j79dq") pod "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" (UID: "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5"). InnerVolumeSpecName "kube-api-access-j79dq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.815002 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" (UID: "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.838977 4580 generic.go:334] "Generic (PLEG): container finished" podID="7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" containerID="26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4" exitCode=143
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.839008 4580 generic.go:334] "Generic (PLEG): container finished" podID="7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" containerID="befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634" exitCode=143
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.839047 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5","Type":"ContainerDied","Data":"26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4"}
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.839076 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5","Type":"ContainerDied","Data":"befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634"}
Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.839086 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5","Type":"ContainerDied","Data":"644a1e593153611cb7519e9243f9a0c78ee803e72d27da94333c6de8f52d4b46"}
Mar 21 05:13:21 crc kubenswrapper[4580]:
I0321 05:13:21.839100 4580 scope.go:117] "RemoveContainer" containerID="26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.839218 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.859033 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-combined-ca-bundle\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.859069 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25ss5\" (UniqueName: \"kubernetes.io/projected/19491f31-c899-4d84-a81b-262d0660b2c1-kube-api-access-25ss5\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.859123 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-config-data\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.859163 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-credential-keys\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 
05:13:21.859181 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-fernet-keys\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.859253 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-public-tls-certs\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.859304 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-scripts\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.859324 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-internal-tls-certs\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.859385 4580 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.859398 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j79dq\" (UniqueName: 
\"kubernetes.io/projected/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-kube-api-access-j79dq\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.859408 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.859418 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.859426 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.859436 4580 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.867359 4580 generic.go:334] "Generic (PLEG): container finished" podID="94084a3b-f944-4e21-a003-a9fee80c7248" containerID="4a0ed2e5f8b3f5e4c5a389cd8ec9df93da3e9997523273906c66c8856504acff" exitCode=143 Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.867421 4580 generic.go:334] "Generic (PLEG): container finished" podID="94084a3b-f944-4e21-a003-a9fee80c7248" containerID="377df9769c8e88363355633f472d9098242ec6c99c843f0d3e4be346e403468e" exitCode=143 Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.867515 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"94084a3b-f944-4e21-a003-a9fee80c7248","Type":"ContainerDied","Data":"4a0ed2e5f8b3f5e4c5a389cd8ec9df93da3e9997523273906c66c8856504acff"} Mar 21 05:13:21 crc 
kubenswrapper[4580]: I0321 05:13:21.867548 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"94084a3b-f944-4e21-a003-a9fee80c7248","Type":"ContainerDied","Data":"377df9769c8e88363355633f472d9098242ec6c99c843f0d3e4be346e403468e"} Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.870915 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m7rwg" event={"ID":"55568564-a701-4c16-b5c4-617f88c364a5","Type":"ContainerStarted","Data":"74309c1e8a8be97da322068060f5c3c7724a07d0be54c0e9b9f1e700809e5b3b"} Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.883484 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55886d54c6-b2qbq" event={"ID":"b91b8cf1-cf3f-4789-b232-03aeea8f47ef","Type":"ContainerStarted","Data":"f847a856383c9da180c3eb5c0b37dc8b9361946501a3cd575d400fbbf48a0b86"} Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.884049 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55886d54c6-b2qbq" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.884307 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55886d54c6-b2qbq" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.892745 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-m7rwg" podStartSLOduration=4.380822269 podStartE2EDuration="1m0.892727781s" podCreationTimestamp="2026-03-21 05:12:21 +0000 UTC" firstStartedPulling="2026-03-21 05:12:23.426082603 +0000 UTC m=+1248.508666231" lastFinishedPulling="2026-03-21 05:13:19.937988125 +0000 UTC m=+1305.020571743" observedRunningTime="2026-03-21 05:13:21.891773566 +0000 UTC m=+1306.974357214" watchObservedRunningTime="2026-03-21 05:13:21.892727781 +0000 UTC m=+1306.975311409" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.895817 4580 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/keystone-bootstrap-2tfwn" event={"ID":"3391bccc-2f7a-469a-8166-5ce7169e9917","Type":"ContainerDied","Data":"638f73a12f2db71b0519d17410a25508c1263ee87f0ccdb962cda43ff662b532"} Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.895887 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="638f73a12f2db71b0519d17410a25508c1263ee87f0ccdb962cda43ff662b532" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.895910 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2tfwn" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.907303 4580 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.927426 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-config-data" (OuterVolumeSpecName: "config-data") pod "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" (UID: "7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.961696 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25ss5\" (UniqueName: \"kubernetes.io/projected/19491f31-c899-4d84-a81b-262d0660b2c1-kube-api-access-25ss5\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.961733 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-combined-ca-bundle\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.961796 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-config-data\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.961830 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-credential-keys\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.961851 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-fernet-keys\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc 
kubenswrapper[4580]: I0321 05:13:21.961928 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-public-tls-certs\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.961986 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-scripts\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.962004 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-internal-tls-certs\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.962069 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.962086 4580 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.969025 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-internal-tls-certs\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " 
pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.976398 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-combined-ca-bundle\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.977064 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-public-tls-certs\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.977608 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-config-data\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.977873 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-scripts\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.978180 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-credential-keys\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.978640 4580 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/19491f31-c899-4d84-a81b-262d0660b2c1-fernet-keys\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.984773 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25ss5\" (UniqueName: \"kubernetes.io/projected/19491f31-c899-4d84-a81b-262d0660b2c1-kube-api-access-25ss5\") pod \"keystone-6d45658b5d-dfjj4\" (UID: \"19491f31-c899-4d84-a81b-262d0660b2c1\") " pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:21 crc kubenswrapper[4580]: I0321 05:13:21.994509 4580 scope.go:117] "RemoveContainer" containerID="befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.057845 4580 scope.go:117] "RemoveContainer" containerID="26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4" Mar 21 05:13:22 crc kubenswrapper[4580]: E0321 05:13:22.059952 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4\": container with ID starting with 26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4 not found: ID does not exist" containerID="26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.059996 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4"} err="failed to get container status \"26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4\": rpc error: code = NotFound desc = could not find container \"26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4\": container with ID starting with 
26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4 not found: ID does not exist" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.060028 4580 scope.go:117] "RemoveContainer" containerID="befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634" Mar 21 05:13:22 crc kubenswrapper[4580]: E0321 05:13:22.061615 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634\": container with ID starting with befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634 not found: ID does not exist" containerID="befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.061650 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634"} err="failed to get container status \"befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634\": rpc error: code = NotFound desc = could not find container \"befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634\": container with ID starting with befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634 not found: ID does not exist" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.061665 4580 scope.go:117] "RemoveContainer" containerID="26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.062002 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4"} err="failed to get container status \"26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4\": rpc error: code = NotFound desc = could not find container \"26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4\": container with ID 
starting with 26017963bd6cf2c81c603b1ff07c22cc0071de71bfe1eb9e654a990f7b7809c4 not found: ID does not exist" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.062016 4580 scope.go:117] "RemoveContainer" containerID="befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.062302 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634"} err="failed to get container status \"befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634\": rpc error: code = NotFound desc = could not find container \"befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634\": container with ID starting with befa33261dbdd22e38b5d844b9135eb078a951683fc2380542c083ad050f4634 not found: ID does not exist" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.192584 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6d45658b5d-dfjj4" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.209018 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55886d54c6-b2qbq" podStartSLOduration=4.208994861 podStartE2EDuration="4.208994861s" podCreationTimestamp="2026-03-21 05:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:21.926993452 +0000 UTC m=+1307.009577090" watchObservedRunningTime="2026-03-21 05:13:22.208994861 +0000 UTC m=+1307.291578489" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.211793 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.239762 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:13:22 crc 
kubenswrapper[4580]: I0321 05:13:22.259963 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.266698 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.274700 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.274814 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.274920 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.340281 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.386481 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-config-data\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.386552 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.386596 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.386636 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.386702 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c812288c-8500-4ea0-b8e8-b835bce24ac1-logs\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.386755 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c812288c-8500-4ea0-b8e8-b835bce24ac1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.386802 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-scripts\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.386839 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zhzrd\" (UniqueName: \"kubernetes.io/projected/c812288c-8500-4ea0-b8e8-b835bce24ac1-kube-api-access-zhzrd\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.493004 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-scripts\") pod \"94084a3b-f944-4e21-a003-a9fee80c7248\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.493636 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnqjw\" (UniqueName: \"kubernetes.io/projected/94084a3b-f944-4e21-a003-a9fee80c7248-kube-api-access-lnqjw\") pod \"94084a3b-f944-4e21-a003-a9fee80c7248\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.494714 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-config-data\") pod \"94084a3b-f944-4e21-a003-a9fee80c7248\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.494743 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94084a3b-f944-4e21-a003-a9fee80c7248-logs\") pod \"94084a3b-f944-4e21-a003-a9fee80c7248\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.494838 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94084a3b-f944-4e21-a003-a9fee80c7248-httpd-run\") pod \"94084a3b-f944-4e21-a003-a9fee80c7248\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") " Mar 21 
05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.494920 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-combined-ca-bundle\") pod \"94084a3b-f944-4e21-a003-a9fee80c7248\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") "
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.494949 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"94084a3b-f944-4e21-a003-a9fee80c7248\" (UID: \"94084a3b-f944-4e21-a003-a9fee80c7248\") "
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.495176 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-config-data\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.495220 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.495258 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.495299 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.495351 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c812288c-8500-4ea0-b8e8-b835bce24ac1-logs\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.495392 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c812288c-8500-4ea0-b8e8-b835bce24ac1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.495420 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-scripts\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.495461 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhzrd\" (UniqueName: \"kubernetes.io/projected/c812288c-8500-4ea0-b8e8-b835bce24ac1-kube-api-access-zhzrd\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.497529 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94084a3b-f944-4e21-a003-a9fee80c7248-logs" (OuterVolumeSpecName: "logs") pod "94084a3b-f944-4e21-a003-a9fee80c7248" (UID: "94084a3b-f944-4e21-a003-a9fee80c7248"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.499555 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c812288c-8500-4ea0-b8e8-b835bce24ac1-logs\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.499983 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.500242 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c812288c-8500-4ea0-b8e8-b835bce24ac1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.500508 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94084a3b-f944-4e21-a003-a9fee80c7248-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "94084a3b-f944-4e21-a003-a9fee80c7248" (UID: "94084a3b-f944-4e21-a003-a9fee80c7248"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.511133 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.519674 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-scripts" (OuterVolumeSpecName: "scripts") pod "94084a3b-f944-4e21-a003-a9fee80c7248" (UID: "94084a3b-f944-4e21-a003-a9fee80c7248"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.519885 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "94084a3b-f944-4e21-a003-a9fee80c7248" (UID: "94084a3b-f944-4e21-a003-a9fee80c7248"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.519976 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhzrd\" (UniqueName: \"kubernetes.io/projected/c812288c-8500-4ea0-b8e8-b835bce24ac1-kube-api-access-zhzrd\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.521514 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-scripts\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.567896 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.570931 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-config-data\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.571447 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94084a3b-f944-4e21-a003-a9fee80c7248-kube-api-access-lnqjw" (OuterVolumeSpecName: "kube-api-access-lnqjw") pod "94084a3b-f944-4e21-a003-a9fee80c7248" (UID: "94084a3b-f944-4e21-a003-a9fee80c7248"). InnerVolumeSpecName "kube-api-access-lnqjw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.572258 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94084a3b-f944-4e21-a003-a9fee80c7248" (UID: "94084a3b-f944-4e21-a003-a9fee80c7248"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.591597 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.599002 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94084a3b-f944-4e21-a003-a9fee80c7248-logs\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.599032 4580 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94084a3b-f944-4e21-a003-a9fee80c7248-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.599044 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.599070 4580 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.599081 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.599090 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnqjw\" (UniqueName: \"kubernetes.io/projected/94084a3b-f944-4e21-a003-a9fee80c7248-kube-api-access-lnqjw\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.641122 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.711683 4580 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.746633 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-config-data" (OuterVolumeSpecName: "config-data") pod "94084a3b-f944-4e21-a003-a9fee80c7248" (UID: "94084a3b-f944-4e21-a003-a9fee80c7248"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.802871 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94084a3b-f944-4e21-a003-a9fee80c7248-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.802911 4580 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.913442 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d45658b5d-dfjj4"]
Mar 21 05:13:22 crc kubenswrapper[4580]: W0321 05:13:22.957956 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19491f31_c899_4d84_a81b_262d0660b2c1.slice/crio-3b47ece6ecb00e7cfb73a493458ddac391025921aa96471f62c04f793a930e4e WatchSource:0}: Error finding container 3b47ece6ecb00e7cfb73a493458ddac391025921aa96471f62c04f793a930e4e: Status 404 returned error can't find the container with id 3b47ece6ecb00e7cfb73a493458ddac391025921aa96471f62c04f793a930e4e
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.960890 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.960950 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"94084a3b-f944-4e21-a003-a9fee80c7248","Type":"ContainerDied","Data":"9ebfac552a9bd47a710304fadd2dece44d1f72c20bef12a277b8c9a9a0707a43"}
Mar 21 05:13:22 crc kubenswrapper[4580]: I0321 05:13:22.961044 4580 scope.go:117] "RemoveContainer" containerID="4a0ed2e5f8b3f5e4c5a389cd8ec9df93da3e9997523273906c66c8856504acff"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.026497 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.050335 4580 scope.go:117] "RemoveContainer" containerID="377df9769c8e88363355633f472d9098242ec6c99c843f0d3e4be346e403468e"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.061911 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.089979 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 05:13:23 crc kubenswrapper[4580]: E0321 05:13:23.090491 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94084a3b-f944-4e21-a003-a9fee80c7248" containerName="glance-log"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.090507 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="94084a3b-f944-4e21-a003-a9fee80c7248" containerName="glance-log"
Mar 21 05:13:23 crc kubenswrapper[4580]: E0321 05:13:23.090529 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94084a3b-f944-4e21-a003-a9fee80c7248" containerName="glance-httpd"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.090537 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="94084a3b-f944-4e21-a003-a9fee80c7248" containerName="glance-httpd"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.090733 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="94084a3b-f944-4e21-a003-a9fee80c7248" containerName="glance-log"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.090753 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="94084a3b-f944-4e21-a003-a9fee80c7248" containerName="glance-httpd"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.091875 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.101217 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.101335 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.101745 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.221744 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.222639 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx785\" (UniqueName: \"kubernetes.io/projected/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-kube-api-access-cx785\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.222798 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.222861 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.222905 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.222959 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.223892 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-logs\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.224051 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.326349 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.326435 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx785\" (UniqueName: \"kubernetes.io/projected/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-kube-api-access-cx785\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.326478 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.326524 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.326551 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.326590 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.326748 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-logs\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.326775 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.327565 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.327961 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-logs\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.328546 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.332479 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.343649 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.344490 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.343945 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.354622 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx785\" (UniqueName: \"kubernetes.io/projected/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-kube-api-access-cx785\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.365869 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.423023 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.426034 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.645681 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5" path="/var/lib/kubelet/pods/7b5f8eca-74bd-448b-9b2a-fe5aa661a3f5/volumes"
Mar 21 05:13:23 crc kubenswrapper[4580]: I0321 05:13:23.646710 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94084a3b-f944-4e21-a003-a9fee80c7248" path="/var/lib/kubelet/pods/94084a3b-f944-4e21-a003-a9fee80c7248/volumes"
Mar 21 05:13:24 crc kubenswrapper[4580]: I0321 05:13:24.000260 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c812288c-8500-4ea0-b8e8-b835bce24ac1","Type":"ContainerStarted","Data":"10e6ef65272e29c7629603520473bc0c71e68674079d4ee2c2e8b5e0f40dd651"}
Mar 21 05:13:24 crc kubenswrapper[4580]: I0321 05:13:24.019940 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d45658b5d-dfjj4" event={"ID":"19491f31-c899-4d84-a81b-262d0660b2c1","Type":"ContainerStarted","Data":"2e840e698551f2a96ce6a42337dfd4d349937a55291f0a20f035668cf23f9148"}
Mar 21 05:13:24 crc kubenswrapper[4580]: I0321 05:13:24.019982 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d45658b5d-dfjj4" event={"ID":"19491f31-c899-4d84-a81b-262d0660b2c1","Type":"ContainerStarted","Data":"3b47ece6ecb00e7cfb73a493458ddac391025921aa96471f62c04f793a930e4e"}
Mar 21 05:13:24 crc kubenswrapper[4580]: I0321 05:13:24.020023 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6d45658b5d-dfjj4"
Mar 21 05:13:24 crc kubenswrapper[4580]: I0321 05:13:24.079918 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6d45658b5d-dfjj4" podStartSLOduration=3.079895856 podStartE2EDuration="3.079895856s" podCreationTimestamp="2026-03-21 05:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:24.073033854 +0000 UTC m=+1309.155617482" watchObservedRunningTime="2026-03-21 05:13:24.079895856 +0000 UTC m=+1309.162479484"
Mar 21 05:13:24 crc kubenswrapper[4580]: I0321 05:13:24.355285 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 05:13:24 crc kubenswrapper[4580]: W0321 05:13:24.368141 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9823fd6_1f6b_4d03_aa58_f4fdba8e7f46.slice/crio-a73426c1d2aab450042fe0cbc7f723d4988ec8230ad0037303a521689d28ba21 WatchSource:0}: Error finding container a73426c1d2aab450042fe0cbc7f723d4988ec8230ad0037303a521689d28ba21: Status 404 returned error can't find the container with id a73426c1d2aab450042fe0cbc7f723d4988ec8230ad0037303a521689d28ba21
Mar 21 05:13:25 crc kubenswrapper[4580]: I0321 05:13:25.051912 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c812288c-8500-4ea0-b8e8-b835bce24ac1","Type":"ContainerStarted","Data":"cc0760a3da7aab4698e56cd44ccdd1b7aab1a15cc610061aa17a82b49df8264d"}
Mar 21 05:13:25 crc kubenswrapper[4580]: I0321 05:13:25.073380 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46","Type":"ContainerStarted","Data":"a73426c1d2aab450042fe0cbc7f723d4988ec8230ad0037303a521689d28ba21"}
Mar 21 05:13:26 crc kubenswrapper[4580]: I0321 05:13:26.083646 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c812288c-8500-4ea0-b8e8-b835bce24ac1","Type":"ContainerStarted","Data":"323da36ba85fcdd25d911164fac79635954bffb8abe58043bddadf33f713b412"}
Mar 21 05:13:26 crc kubenswrapper[4580]: I0321 05:13:26.085148 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46","Type":"ContainerStarted","Data":"1611d88ed1574f196250e15c569749879849b59f34652e475b3b7008de9a4f79"}
Mar 21 05:13:26 crc kubenswrapper[4580]: I0321 05:13:26.108146 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.108127136 podStartE2EDuration="4.108127136s" podCreationTimestamp="2026-03-21 05:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:26.101899481 +0000 UTC m=+1311.184483119" watchObservedRunningTime="2026-03-21 05:13:26.108127136 +0000 UTC m=+1311.190710764"
Mar 21 05:13:27 crc kubenswrapper[4580]: I0321 05:13:27.095022 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46","Type":"ContainerStarted","Data":"8eed586d42a9da62f78732e2bf8bcfd250c213f5e900feb05784cb50f738502b"}
Mar 21 05:13:27 crc kubenswrapper[4580]: I0321 05:13:27.123996 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.123976137 podStartE2EDuration="4.123976137s" podCreationTimestamp="2026-03-21 05:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:27.114582577 +0000 UTC m=+1312.197166205" watchObservedRunningTime="2026-03-21 05:13:27.123976137 +0000 UTC m=+1312.206559755"
Mar 21 05:13:27 crc kubenswrapper[4580]: I0321 05:13:27.683976 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq"
Mar 21 05:13:27 crc kubenswrapper[4580]: I0321 05:13:27.778256 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-5tp6t"]
Mar 21 05:13:27 crc kubenswrapper[4580]: I0321 05:13:27.778661 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" podUID="cf72524d-be2d-4051-a966-4d0cbfb2523e" containerName="dnsmasq-dns" containerID="cri-o://d61cd1fa45b0e5a3d303367f2ba6300dc6aa59e59a82a07987f668ee101ff12f" gracePeriod=10
Mar 21 05:13:28 crc kubenswrapper[4580]: I0321 05:13:28.106631 4580 generic.go:334] "Generic (PLEG): container finished" podID="3bfbab08-aee7-43bf-9118-252682438c95" containerID="cf1fbd9cf540e60eee221e084401254071b0f33e589ff7e0ba9e0bd010cc35ad" exitCode=0
Mar 21 05:13:28 crc kubenswrapper[4580]: I0321 05:13:28.107008 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f2ksl" event={"ID":"3bfbab08-aee7-43bf-9118-252682438c95","Type":"ContainerDied","Data":"cf1fbd9cf540e60eee221e084401254071b0f33e589ff7e0ba9e0bd010cc35ad"}
Mar 21 05:13:28 crc kubenswrapper[4580]: I0321 05:13:28.121958 4580 generic.go:334] "Generic (PLEG): container finished" podID="cf72524d-be2d-4051-a966-4d0cbfb2523e" containerID="d61cd1fa45b0e5a3d303367f2ba6300dc6aa59e59a82a07987f668ee101ff12f" exitCode=0
Mar 21 05:13:28 crc kubenswrapper[4580]: I0321 05:13:28.122930 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" event={"ID":"cf72524d-be2d-4051-a966-4d0cbfb2523e","Type":"ContainerDied","Data":"d61cd1fa45b0e5a3d303367f2ba6300dc6aa59e59a82a07987f668ee101ff12f"}
Mar 21 05:13:31 crc kubenswrapper[4580]: I0321 05:13:31.394827 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Mar 21 05:13:31 crc kubenswrapper[4580]: I0321 05:13:31.507316 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused"
Mar 21 05:13:32 crc kubenswrapper[4580]: I0321 05:13:32.162160 4580 generic.go:334] "Generic (PLEG): container finished" podID="55568564-a701-4c16-b5c4-617f88c364a5" containerID="74309c1e8a8be97da322068060f5c3c7724a07d0be54c0e9b9f1e700809e5b3b" exitCode=0
Mar 21 05:13:32 crc kubenswrapper[4580]: I0321 05:13:32.162334 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m7rwg" event={"ID":"55568564-a701-4c16-b5c4-617f88c364a5","Type":"ContainerDied","Data":"74309c1e8a8be97da322068060f5c3c7724a07d0be54c0e9b9f1e700809e5b3b"}
Mar 21 05:13:32 crc kubenswrapper[4580]: I0321 05:13:32.168514 4580 generic.go:334] "Generic (PLEG): container finished" podID="b1916415-d4eb-4dbd-bccb-ac932a09843c" containerID="5271d2e1c3d8b200abfe020ec89602d597e379128fb07749c2f006b86f466527" exitCode=0
Mar 21 05:13:32 crc kubenswrapper[4580]: I0321 05:13:32.168557 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tb6cr" event={"ID":"b1916415-d4eb-4dbd-bccb-ac932a09843c","Type":"ContainerDied","Data":"5271d2e1c3d8b200abfe020ec89602d597e379128fb07749c2f006b86f466527"}
Mar 21 05:13:32 crc kubenswrapper[4580]: I0321 05:13:32.642962 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 21 05:13:32 crc kubenswrapper[4580]: I0321 05:13:32.643027 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 21 05:13:32 crc kubenswrapper[4580]: I0321 05:13:32.693317 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 21 05:13:32 crc kubenswrapper[4580]: I0321 05:13:32.707179 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.131824 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t"
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.137369 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f2ksl"
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.200013 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" event={"ID":"cf72524d-be2d-4051-a966-4d0cbfb2523e","Type":"ContainerDied","Data":"8ba69dfde9e2ad60f55f82c17533eaa48587760d235be8fe93071a1c3be66131"}
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.200103 4580 scope.go:117] "RemoveContainer" containerID="d61cd1fa45b0e5a3d303367f2ba6300dc6aa59e59a82a07987f668ee101ff12f"
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.200336 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t"
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.230953 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f2ksl" event={"ID":"3bfbab08-aee7-43bf-9118-252682438c95","Type":"ContainerDied","Data":"62c79f8b2bded3641cf7a2d8bfc6f0792c50545e9d4d4426b00640accdd01c86"}
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.231007 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62c79f8b2bded3641cf7a2d8bfc6f0792c50545e9d4d4426b00640accdd01c86"
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.231122 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f2ksl"
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.232619 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.232680 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.278798 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3bfbab08-aee7-43bf-9118-252682438c95-db-sync-config-data\") pod \"3bfbab08-aee7-43bf-9118-252682438c95\" (UID: \"3bfbab08-aee7-43bf-9118-252682438c95\") "
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.278906 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-dns-swift-storage-0\") pod \"cf72524d-be2d-4051-a966-4d0cbfb2523e\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") "
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.278970 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvz6p\" (UniqueName: \"kubernetes.io/projected/cf72524d-be2d-4051-a966-4d0cbfb2523e-kube-api-access-nvz6p\") pod \"cf72524d-be2d-4051-a966-4d0cbfb2523e\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") "
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.278992 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-ovsdbserver-sb\") pod \"cf72524d-be2d-4051-a966-4d0cbfb2523e\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") "
Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.279062 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-config\") pod \"cf72524d-be2d-4051-a966-4d0cbfb2523e\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.279100 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9cdd\" (UniqueName: \"kubernetes.io/projected/3bfbab08-aee7-43bf-9118-252682438c95-kube-api-access-d9cdd\") pod \"3bfbab08-aee7-43bf-9118-252682438c95\" (UID: \"3bfbab08-aee7-43bf-9118-252682438c95\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.279149 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-ovsdbserver-nb\") pod \"cf72524d-be2d-4051-a966-4d0cbfb2523e\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.279243 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-dns-svc\") pod \"cf72524d-be2d-4051-a966-4d0cbfb2523e\" (UID: \"cf72524d-be2d-4051-a966-4d0cbfb2523e\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.279288 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfbab08-aee7-43bf-9118-252682438c95-combined-ca-bundle\") pod \"3bfbab08-aee7-43bf-9118-252682438c95\" (UID: \"3bfbab08-aee7-43bf-9118-252682438c95\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.288671 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bfbab08-aee7-43bf-9118-252682438c95-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3bfbab08-aee7-43bf-9118-252682438c95" (UID: "3bfbab08-aee7-43bf-9118-252682438c95"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.296384 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf72524d-be2d-4051-a966-4d0cbfb2523e-kube-api-access-nvz6p" (OuterVolumeSpecName: "kube-api-access-nvz6p") pod "cf72524d-be2d-4051-a966-4d0cbfb2523e" (UID: "cf72524d-be2d-4051-a966-4d0cbfb2523e"). InnerVolumeSpecName "kube-api-access-nvz6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.297159 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bfbab08-aee7-43bf-9118-252682438c95-kube-api-access-d9cdd" (OuterVolumeSpecName: "kube-api-access-d9cdd") pod "3bfbab08-aee7-43bf-9118-252682438c95" (UID: "3bfbab08-aee7-43bf-9118-252682438c95"). InnerVolumeSpecName "kube-api-access-d9cdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.344268 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bfbab08-aee7-43bf-9118-252682438c95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bfbab08-aee7-43bf-9118-252682438c95" (UID: "3bfbab08-aee7-43bf-9118-252682438c95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.363602 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf72524d-be2d-4051-a966-4d0cbfb2523e" (UID: "cf72524d-be2d-4051-a966-4d0cbfb2523e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.366746 4580 scope.go:117] "RemoveContainer" containerID="f714f7b375bf2479c501f8fadaefd888ccca1a257ad4775c47ca7d73f68007d5" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.367212 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf72524d-be2d-4051-a966-4d0cbfb2523e" (UID: "cf72524d-be2d-4051-a966-4d0cbfb2523e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.383238 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.383265 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfbab08-aee7-43bf-9118-252682438c95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.383276 4580 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3bfbab08-aee7-43bf-9118-252682438c95-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.383284 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvz6p\" (UniqueName: \"kubernetes.io/projected/cf72524d-be2d-4051-a966-4d0cbfb2523e-kube-api-access-nvz6p\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.383293 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9cdd\" (UniqueName: 
\"kubernetes.io/projected/3bfbab08-aee7-43bf-9118-252682438c95-kube-api-access-d9cdd\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.383301 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.404944 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-config" (OuterVolumeSpecName: "config") pod "cf72524d-be2d-4051-a966-4d0cbfb2523e" (UID: "cf72524d-be2d-4051-a966-4d0cbfb2523e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.433379 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.434456 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.435916 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cf72524d-be2d-4051-a966-4d0cbfb2523e" (UID: "cf72524d-be2d-4051-a966-4d0cbfb2523e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.481929 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf72524d-be2d-4051-a966-4d0cbfb2523e" (UID: "cf72524d-be2d-4051-a966-4d0cbfb2523e"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.494309 4580 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.494350 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.494363 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf72524d-be2d-4051-a966-4d0cbfb2523e-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.494435 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.542404 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.668984 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-5tp6t"] Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.700447 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-5tp6t"] Mar 21 05:13:33 crc kubenswrapper[4580]: E0321 05:13:33.748150 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 
05:13:33.791719 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tb6cr" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.804655 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.900499 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-combined-ca-bundle\") pod \"55568564-a701-4c16-b5c4-617f88c364a5\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.900587 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55568564-a701-4c16-b5c4-617f88c364a5-etc-machine-id\") pod \"55568564-a701-4c16-b5c4-617f88c364a5\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.900656 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-scripts\") pod \"55568564-a701-4c16-b5c4-617f88c364a5\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.900719 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1916415-d4eb-4dbd-bccb-ac932a09843c-config\") pod \"b1916415-d4eb-4dbd-bccb-ac932a09843c\" (UID: \"b1916415-d4eb-4dbd-bccb-ac932a09843c\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.900737 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1916415-d4eb-4dbd-bccb-ac932a09843c-combined-ca-bundle\") pod 
\"b1916415-d4eb-4dbd-bccb-ac932a09843c\" (UID: \"b1916415-d4eb-4dbd-bccb-ac932a09843c\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.900963 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6q57\" (UniqueName: \"kubernetes.io/projected/55568564-a701-4c16-b5c4-617f88c364a5-kube-api-access-j6q57\") pod \"55568564-a701-4c16-b5c4-617f88c364a5\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.900999 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-db-sync-config-data\") pod \"55568564-a701-4c16-b5c4-617f88c364a5\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.901029 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk4vn\" (UniqueName: \"kubernetes.io/projected/b1916415-d4eb-4dbd-bccb-ac932a09843c-kube-api-access-dk4vn\") pod \"b1916415-d4eb-4dbd-bccb-ac932a09843c\" (UID: \"b1916415-d4eb-4dbd-bccb-ac932a09843c\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.901110 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-config-data\") pod \"55568564-a701-4c16-b5c4-617f88c364a5\" (UID: \"55568564-a701-4c16-b5c4-617f88c364a5\") " Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.900987 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55568564-a701-4c16-b5c4-617f88c364a5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "55568564-a701-4c16-b5c4-617f88c364a5" (UID: "55568564-a701-4c16-b5c4-617f88c364a5"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.902016 4580 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55568564-a701-4c16-b5c4-617f88c364a5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.913311 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "55568564-a701-4c16-b5c4-617f88c364a5" (UID: "55568564-a701-4c16-b5c4-617f88c364a5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.914610 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-scripts" (OuterVolumeSpecName: "scripts") pod "55568564-a701-4c16-b5c4-617f88c364a5" (UID: "55568564-a701-4c16-b5c4-617f88c364a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.914661 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55568564-a701-4c16-b5c4-617f88c364a5-kube-api-access-j6q57" (OuterVolumeSpecName: "kube-api-access-j6q57") pod "55568564-a701-4c16-b5c4-617f88c364a5" (UID: "55568564-a701-4c16-b5c4-617f88c364a5"). InnerVolumeSpecName "kube-api-access-j6q57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.917385 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1916415-d4eb-4dbd-bccb-ac932a09843c-kube-api-access-dk4vn" (OuterVolumeSpecName: "kube-api-access-dk4vn") pod "b1916415-d4eb-4dbd-bccb-ac932a09843c" (UID: "b1916415-d4eb-4dbd-bccb-ac932a09843c"). InnerVolumeSpecName "kube-api-access-dk4vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.935878 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55568564-a701-4c16-b5c4-617f88c364a5" (UID: "55568564-a701-4c16-b5c4-617f88c364a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.938652 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1916415-d4eb-4dbd-bccb-ac932a09843c-config" (OuterVolumeSpecName: "config") pod "b1916415-d4eb-4dbd-bccb-ac932a09843c" (UID: "b1916415-d4eb-4dbd-bccb-ac932a09843c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.938865 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1916415-d4eb-4dbd-bccb-ac932a09843c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1916415-d4eb-4dbd-bccb-ac932a09843c" (UID: "b1916415-d4eb-4dbd-bccb-ac932a09843c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:33 crc kubenswrapper[4580]: I0321 05:13:33.949137 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-config-data" (OuterVolumeSpecName: "config-data") pod "55568564-a701-4c16-b5c4-617f88c364a5" (UID: "55568564-a701-4c16-b5c4-617f88c364a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.006469 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1916415-d4eb-4dbd-bccb-ac932a09843c-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.006502 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1916415-d4eb-4dbd-bccb-ac932a09843c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.006518 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6q57\" (UniqueName: \"kubernetes.io/projected/55568564-a701-4c16-b5c4-617f88c364a5-kube-api-access-j6q57\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.006531 4580 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.006543 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk4vn\" (UniqueName: \"kubernetes.io/projected/b1916415-d4eb-4dbd-bccb-ac932a09843c-kube-api-access-dk4vn\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.006553 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.006564 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.006574 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55568564-a701-4c16-b5c4-617f88c364a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.241184 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tb6cr" event={"ID":"b1916415-d4eb-4dbd-bccb-ac932a09843c","Type":"ContainerDied","Data":"38061da3297295905c0818b842451978ca28a011c31eab67453c6d62cc4ccbf2"} Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.241240 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38061da3297295905c0818b842451978ca28a011c31eab67453c6d62cc4ccbf2" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.241324 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tb6cr" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.245195 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-m7rwg" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.245189 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m7rwg" event={"ID":"55568564-a701-4c16-b5c4-617f88c364a5","Type":"ContainerDied","Data":"50ffd984b7e6da06c3d33d9f673c1055627920b7b2eb422080c870a96685400d"} Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.245308 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50ffd984b7e6da06c3d33d9f673c1055627920b7b2eb422080c870a96685400d" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.248992 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c311e091-7cf1-426b-9788-a3d64b198e43","Type":"ContainerStarted","Data":"ea1247a808b934e0c8f07cf7b669ca9c2f7c2f17267d5c74df208caeb313ea12"} Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.249479 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.249511 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.249623 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" containerName="ceilometer-notification-agent" containerID="cri-o://2ee092012d167f5b248abd488e907d058d7e1ad76a3a5daf34599a61962fa2b6" gracePeriod=30 Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.249652 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" containerName="proxy-httpd" containerID="cri-o://ea1247a808b934e0c8f07cf7b669ca9c2f7c2f17267d5c74df208caeb313ea12" gracePeriod=30 Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 
05:13:34.249719 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" containerName="sg-core" containerID="cri-o://79b346aa629f5e0c52009561345e629f878a979fb3866889dc45c60a0a077a68" gracePeriod=30 Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.578703 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-ml5pw"] Mar 21 05:13:34 crc kubenswrapper[4580]: E0321 05:13:34.579062 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55568564-a701-4c16-b5c4-617f88c364a5" containerName="cinder-db-sync" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.579075 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="55568564-a701-4c16-b5c4-617f88c364a5" containerName="cinder-db-sync" Mar 21 05:13:34 crc kubenswrapper[4580]: E0321 05:13:34.579094 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1916415-d4eb-4dbd-bccb-ac932a09843c" containerName="neutron-db-sync" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.579100 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1916415-d4eb-4dbd-bccb-ac932a09843c" containerName="neutron-db-sync" Mar 21 05:13:34 crc kubenswrapper[4580]: E0321 05:13:34.579110 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bfbab08-aee7-43bf-9118-252682438c95" containerName="barbican-db-sync" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.579127 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfbab08-aee7-43bf-9118-252682438c95" containerName="barbican-db-sync" Mar 21 05:13:34 crc kubenswrapper[4580]: E0321 05:13:34.579143 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf72524d-be2d-4051-a966-4d0cbfb2523e" containerName="init" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.579149 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf72524d-be2d-4051-a966-4d0cbfb2523e" 
containerName="init" Mar 21 05:13:34 crc kubenswrapper[4580]: E0321 05:13:34.579157 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf72524d-be2d-4051-a966-4d0cbfb2523e" containerName="dnsmasq-dns" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.579163 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf72524d-be2d-4051-a966-4d0cbfb2523e" containerName="dnsmasq-dns" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.579310 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bfbab08-aee7-43bf-9118-252682438c95" containerName="barbican-db-sync" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.579348 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="55568564-a701-4c16-b5c4-617f88c364a5" containerName="cinder-db-sync" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.579356 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf72524d-be2d-4051-a966-4d0cbfb2523e" containerName="dnsmasq-dns" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.579365 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1916415-d4eb-4dbd-bccb-ac932a09843c" containerName="neutron-db-sync" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.580673 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.659455 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.675585 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.676814 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.677138 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-dns-svc\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.677314 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmspb\" (UniqueName: \"kubernetes.io/projected/a2cc638c-f6d4-4016-b100-9d327b65065c-kube-api-access-fmspb\") pod 
\"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.677541 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-config\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.661079 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-ml5pw"] Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.780739 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmspb\" (UniqueName: \"kubernetes.io/projected/a2cc638c-f6d4-4016-b100-9d327b65065c-kube-api-access-fmspb\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.784004 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-config\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.784094 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.784165 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.784189 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.784399 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-dns-svc\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.786128 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.787703 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-config\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.789036 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.790673 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.791018 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-dns-svc\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.812121 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57d468d6b8-4xwvd"] Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.813637 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.833152 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.833614 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.833887 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2ctzv" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.833923 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.856730 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57d468d6b8-4xwvd"] Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.886018 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-httpd-config\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.886316 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-ovndb-tls-certs\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.886491 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-config\") pod \"neutron-57d468d6b8-4xwvd\" (UID: 
\"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.886973 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9qgj\" (UniqueName: \"kubernetes.io/projected/33984e31-23ff-4d28-9828-74d12b7fc0a7-kube-api-access-g9qgj\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.887151 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-combined-ca-bundle\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.887349 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c"] Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.891199 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.909840 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.911604 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.914471 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.914658 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tlsfx" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.916037 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmspb\" (UniqueName: \"kubernetes.io/projected/a2cc638c-f6d4-4016-b100-9d327b65065c-kube-api-access-fmspb\") pod \"dnsmasq-dns-55f844cf75-ml5pw\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.935640 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.964825 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-npsxx" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.965018 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.965908 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.965937 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.966861 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-787f545779-9db4b"] Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.972496 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.982826 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991429 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4b4bf3-0508-4021-916b-97694fe670ff-combined-ca-bundle\") pod \"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991488 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83350763-3294-4c10-8bf0-531aec2e110f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991520 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9qgj\" (UniqueName: \"kubernetes.io/projected/33984e31-23ff-4d28-9828-74d12b7fc0a7-kube-api-access-g9qgj\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991564 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991581 4580 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-combined-ca-bundle\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991597 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f4b4bf3-0508-4021-916b-97694fe670ff-config-data-custom\") pod \"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991619 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grp5n\" (UniqueName: \"kubernetes.io/projected/83350763-3294-4c10-8bf0-531aec2e110f-kube-api-access-grp5n\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991648 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f4b4bf3-0508-4021-916b-97694fe670ff-logs\") pod \"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991672 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-httpd-config\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991690 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f4b4bf3-0508-4021-916b-97694fe670ff-config-data\") pod \"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991709 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991728 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwzjx\" (UniqueName: \"kubernetes.io/projected/0f4b4bf3-0508-4021-916b-97694fe670ff-kube-api-access-dwzjx\") pod \"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991768 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-ovndb-tls-certs\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991838 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-scripts\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991854 4580 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-config-data\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:34 crc kubenswrapper[4580]: I0321 05:13:34.991877 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-config\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.013609 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.022087 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-httpd-config\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.027369 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c"] Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.033945 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-combined-ca-bundle\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.034743 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-config\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.039143 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-ovndb-tls-certs\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093091 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d79bd04a-35d0-48ab-883f-982e3129d435-combined-ca-bundle\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093144 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-scripts\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093170 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-config-data\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093206 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4b4bf3-0508-4021-916b-97694fe670ff-combined-ca-bundle\") pod 
\"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093236 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83350763-3294-4c10-8bf0-531aec2e110f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093253 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d79bd04a-35d0-48ab-883f-982e3129d435-logs\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093290 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d79bd04a-35d0-48ab-883f-982e3129d435-config-data-custom\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093334 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093354 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f4b4bf3-0508-4021-916b-97694fe670ff-config-data-custom\") pod 
\"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093378 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grp5n\" (UniqueName: \"kubernetes.io/projected/83350763-3294-4c10-8bf0-531aec2e110f-kube-api-access-grp5n\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093404 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f4b4bf3-0508-4021-916b-97694fe670ff-logs\") pod \"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093426 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f4b4bf3-0508-4021-916b-97694fe670ff-config-data\") pod \"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093452 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093475 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwzjx\" (UniqueName: \"kubernetes.io/projected/0f4b4bf3-0508-4021-916b-97694fe670ff-kube-api-access-dwzjx\") pod 
\"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093495 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpm86\" (UniqueName: \"kubernetes.io/projected/d79bd04a-35d0-48ab-883f-982e3129d435-kube-api-access-wpm86\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.093516 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d79bd04a-35d0-48ab-883f-982e3129d435-config-data\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.105924 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f4b4bf3-0508-4021-916b-97694fe670ff-config-data-custom\") pod \"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.108692 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83350763-3294-4c10-8bf0-531aec2e110f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.109345 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0f4b4bf3-0508-4021-916b-97694fe670ff-logs\") pod \"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.113595 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.115205 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-config-data\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.116115 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f4b4bf3-0508-4021-916b-97694fe670ff-config-data\") pod \"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.116791 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-scripts\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.117310 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.118768 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4b4bf3-0508-4021-916b-97694fe670ff-combined-ca-bundle\") pod \"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.128059 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.138218 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9qgj\" (UniqueName: \"kubernetes.io/projected/33984e31-23ff-4d28-9828-74d12b7fc0a7-kube-api-access-g9qgj\") pod \"neutron-57d468d6b8-4xwvd\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") " pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.179381 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-787f545779-9db4b"] Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.180169 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.194692 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpm86\" (UniqueName: \"kubernetes.io/projected/d79bd04a-35d0-48ab-883f-982e3129d435-kube-api-access-wpm86\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.194752 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d79bd04a-35d0-48ab-883f-982e3129d435-config-data\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.194850 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d79bd04a-35d0-48ab-883f-982e3129d435-combined-ca-bundle\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.194977 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d79bd04a-35d0-48ab-883f-982e3129d435-logs\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.195019 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d79bd04a-35d0-48ab-883f-982e3129d435-config-data-custom\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " 
pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.195536 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwzjx\" (UniqueName: \"kubernetes.io/projected/0f4b4bf3-0508-4021-916b-97694fe670ff-kube-api-access-dwzjx\") pod \"barbican-keystone-listener-7cfb6cbb9d-ln66c\" (UID: \"0f4b4bf3-0508-4021-916b-97694fe670ff\") " pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.197946 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d79bd04a-35d0-48ab-883f-982e3129d435-logs\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.211279 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grp5n\" (UniqueName: \"kubernetes.io/projected/83350763-3294-4c10-8bf0-531aec2e110f-kube-api-access-grp5n\") pod \"cinder-scheduler-0\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " pod="openstack/cinder-scheduler-0" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.217162 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d79bd04a-35d0-48ab-883f-982e3129d435-config-data-custom\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.218393 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d79bd04a-35d0-48ab-883f-982e3129d435-config-data\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 
05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.224534 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d79bd04a-35d0-48ab-883f-982e3129d435-combined-ca-bundle\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.260310 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpm86\" (UniqueName: \"kubernetes.io/projected/d79bd04a-35d0-48ab-883f-982e3129d435-kube-api-access-wpm86\") pod \"barbican-worker-787f545779-9db4b\" (UID: \"d79bd04a-35d0-48ab-883f-982e3129d435\") " pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.287307 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.322839 4580 generic.go:334] "Generic (PLEG): container finished" podID="c311e091-7cf1-426b-9788-a3d64b198e43" containerID="ea1247a808b934e0c8f07cf7b669ca9c2f7c2f17267d5c74df208caeb313ea12" exitCode=0 Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.322878 4580 generic.go:334] "Generic (PLEG): container finished" podID="c311e091-7cf1-426b-9788-a3d64b198e43" containerID="79b346aa629f5e0c52009561345e629f878a979fb3866889dc45c60a0a077a68" exitCode=2 Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.322975 4580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.322985 4580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.323906 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c311e091-7cf1-426b-9788-a3d64b198e43","Type":"ContainerDied","Data":"ea1247a808b934e0c8f07cf7b669ca9c2f7c2f17267d5c74df208caeb313ea12"} Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.323941 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c311e091-7cf1-426b-9788-a3d64b198e43","Type":"ContainerDied","Data":"79b346aa629f5e0c52009561345e629f878a979fb3866889dc45c60a0a077a68"} Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.380860 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.410437 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-787f545779-9db4b" Mar 21 05:13:35 crc kubenswrapper[4580]: E0321 05:13:35.462894 4580 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc311e091_7cf1_426b_9788_a3d64b198e43.slice/crio-conmon-ea1247a808b934e0c8f07cf7b669ca9c2f7c2f17267d5c74df208caeb313ea12.scope\": RecentStats: unable to find data in memory cache]" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.493205 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-ml5pw"] Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.519931 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5bbbf4f7bb-rvhzb"] Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.521331 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.541963 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.582861 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bbbf4f7bb-rvhzb"] Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.638703 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-config-data\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.639989 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-config-data-custom\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.640163 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-combined-ca-bundle\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.640292 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpfks\" (UniqueName: \"kubernetes.io/projected/3ff1f3d4-eada-4182-9a04-48e23f84d11d-kube-api-access-qpfks\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: 
\"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.640398 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ff1f3d4-eada-4182-9a04-48e23f84d11d-logs\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.693245 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf72524d-be2d-4051-a966-4d0cbfb2523e" path="/var/lib/kubelet/pods/cf72524d-be2d-4051-a966-4d0cbfb2523e/volumes" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.725849 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-28d7v"] Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.740925 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.742546 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ff1f3d4-eada-4182-9a04-48e23f84d11d-logs\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.742576 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-config-data\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.742676 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-config-data-custom\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.742723 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-combined-ca-bundle\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.742743 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpfks\" (UniqueName: \"kubernetes.io/projected/3ff1f3d4-eada-4182-9a04-48e23f84d11d-kube-api-access-qpfks\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " 
pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.743272 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ff1f3d4-eada-4182-9a04-48e23f84d11d-logs\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.772714 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-combined-ca-bundle\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.839420 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-config-data-custom\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.840574 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-config-data\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.893124 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpfks\" (UniqueName: \"kubernetes.io/projected/3ff1f3d4-eada-4182-9a04-48e23f84d11d-kube-api-access-qpfks\") pod \"barbican-api-5bbbf4f7bb-rvhzb\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") " pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:35 crc kubenswrapper[4580]: 
I0321 05:13:35.919210 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-28d7v"] Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.985867 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.986083 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.986153 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-config\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.986188 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.986243 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cqqt\" (UniqueName: 
\"kubernetes.io/projected/03e4309f-b795-4c00-8058-616430f6ea8a-kube-api-access-9cqqt\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.986285 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:35 crc kubenswrapper[4580]: I0321 05:13:35.989615 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.107182 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cqqt\" (UniqueName: \"kubernetes.io/projected/03e4309f-b795-4c00-8058-616430f6ea8a-kube-api-access-9cqqt\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.107544 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.107661 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:36 crc 
kubenswrapper[4580]: I0321 05:13:36.107886 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.107987 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-config\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.108057 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.112428 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.114256 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.121587 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.152027 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.153530 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.154257 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.155973 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.158097 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-config\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.164705 4580 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.189577 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cqqt\" (UniqueName: \"kubernetes.io/projected/03e4309f-b795-4c00-8058-616430f6ea8a-kube-api-access-9cqqt\") pod \"dnsmasq-dns-5c9776ccc5-28d7v\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.203234 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.212736 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrvxl\" (UniqueName: \"kubernetes.io/projected/d8267ff8-754f-446f-a7b1-94462f0f9c93-kube-api-access-xrvxl\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.212827 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.212874 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-scripts\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc 
kubenswrapper[4580]: I0321 05:13:36.212908 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8267ff8-754f-446f-a7b1-94462f0f9c93-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.212950 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8267ff8-754f-446f-a7b1-94462f0f9c93-logs\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.212994 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-config-data-custom\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.213022 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-config-data\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.315543 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.315599 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-scripts\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.315629 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8267ff8-754f-446f-a7b1-94462f0f9c93-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.315672 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8267ff8-754f-446f-a7b1-94462f0f9c93-logs\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.315715 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-config-data-custom\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.315738 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-config-data\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.315846 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8267ff8-754f-446f-a7b1-94462f0f9c93-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.315849 4580 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrvxl\" (UniqueName: \"kubernetes.io/projected/d8267ff8-754f-446f-a7b1-94462f0f9c93-kube-api-access-xrvxl\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.320221 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8267ff8-754f-446f-a7b1-94462f0f9c93-logs\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.322585 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.323964 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-config-data\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.324385 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-config-data-custom\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.326686 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-scripts\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " 
pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.348989 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrvxl\" (UniqueName: \"kubernetes.io/projected/d8267ff8-754f-446f-a7b1-94462f0f9c93-kube-api-access-xrvxl\") pod \"cinder-api-0\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") " pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.440662 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-ml5pw"] Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.464452 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.566486 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57d468d6b8-4xwvd"] Mar 21 05:13:36 crc kubenswrapper[4580]: I0321 05:13:36.714878 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c"] Mar 21 05:13:36 crc kubenswrapper[4580]: W0321 05:13:36.778333 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f4b4bf3_0508_4021_916b_97694fe670ff.slice/crio-eb47fda41010ff7f7ab9fc4ce87a5c220506117dc4f3aa8a260a06b30502de5b WatchSource:0}: Error finding container eb47fda41010ff7f7ab9fc4ce87a5c220506117dc4f3aa8a260a06b30502de5b: Status 404 returned error can't find the container with id eb47fda41010ff7f7ab9fc4ce87a5c220506117dc4f3aa8a260a06b30502de5b Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.101454 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-787f545779-9db4b"] Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.128136 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 05:13:37 crc kubenswrapper[4580]: W0321 05:13:37.169108 4580 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83350763_3294_4c10_8bf0_531aec2e110f.slice/crio-59d6ef2c10096824b21dcd8cf16ffd91233bdb04ace7bcca0f5e080ec08e9829 WatchSource:0}: Error finding container 59d6ef2c10096824b21dcd8cf16ffd91233bdb04ace7bcca0f5e080ec08e9829: Status 404 returned error can't find the container with id 59d6ef2c10096824b21dcd8cf16ffd91233bdb04ace7bcca0f5e080ec08e9829 Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.276953 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bbbf4f7bb-rvhzb"] Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.403707 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" event={"ID":"0f4b4bf3-0508-4021-916b-97694fe670ff","Type":"ContainerStarted","Data":"eb47fda41010ff7f7ab9fc4ce87a5c220506117dc4f3aa8a260a06b30502de5b"} Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.405264 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"83350763-3294-4c10-8bf0-531aec2e110f","Type":"ContainerStarted","Data":"59d6ef2c10096824b21dcd8cf16ffd91233bdb04ace7bcca0f5e080ec08e9829"} Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.406239 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-787f545779-9db4b" event={"ID":"d79bd04a-35d0-48ab-883f-982e3129d435","Type":"ContainerStarted","Data":"8b2d0b011eea6ebff4a6af24691137cde70587bea6b0a8a89a34981e8010f3f9"} Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.413560 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-28d7v"] Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.430111 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d468d6b8-4xwvd" 
event={"ID":"33984e31-23ff-4d28-9828-74d12b7fc0a7","Type":"ContainerStarted","Data":"d13a268270d73439428136cc30f637118603cadce1019c8dd25bc9f0ac7c91d1"} Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.430376 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d468d6b8-4xwvd" event={"ID":"33984e31-23ff-4d28-9828-74d12b7fc0a7","Type":"ContainerStarted","Data":"1885fea2f8dae75f1673f3b37957d723ebf32926c2610642db44e5b549a9ae1e"} Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.432204 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" event={"ID":"3ff1f3d4-eada-4182-9a04-48e23f84d11d","Type":"ContainerStarted","Data":"73aba58797f134b331641284137d4d69b627c6ddb3a62152104aca66c651ef6b"} Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.436586 4580 generic.go:334] "Generic (PLEG): container finished" podID="a2cc638c-f6d4-4016-b100-9d327b65065c" containerID="dcd8a97997983f984f6e5091f400bc27b75fcd7a18fe95f08ba34122be406d8b" exitCode=0 Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.436614 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" event={"ID":"a2cc638c-f6d4-4016-b100-9d327b65065c","Type":"ContainerDied","Data":"dcd8a97997983f984f6e5091f400bc27b75fcd7a18fe95f08ba34122be406d8b"} Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.436634 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" event={"ID":"a2cc638c-f6d4-4016-b100-9d327b65065c","Type":"ContainerStarted","Data":"5ea8087c5160e37e1177c7a19c4e6e13f45c1f9a7d2b28ab87df2f1d973f49ed"} Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.546024 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.741974 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-5tp6t" 
podUID="cf72524d-be2d-4051-a966-4d0cbfb2523e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: i/o timeout" Mar 21 05:13:37 crc kubenswrapper[4580]: I0321 05:13:37.987621 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.356777 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.431601 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-dns-svc\") pod \"a2cc638c-f6d4-4016-b100-9d327b65065c\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.431710 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-ovsdbserver-nb\") pod \"a2cc638c-f6d4-4016-b100-9d327b65065c\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.431762 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-dns-swift-storage-0\") pod \"a2cc638c-f6d4-4016-b100-9d327b65065c\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.431971 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmspb\" (UniqueName: \"kubernetes.io/projected/a2cc638c-f6d4-4016-b100-9d327b65065c-kube-api-access-fmspb\") pod \"a2cc638c-f6d4-4016-b100-9d327b65065c\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.432008 4580 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-config\") pod \"a2cc638c-f6d4-4016-b100-9d327b65065c\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.433439 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-ovsdbserver-sb\") pod \"a2cc638c-f6d4-4016-b100-9d327b65065c\" (UID: \"a2cc638c-f6d4-4016-b100-9d327b65065c\") " Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.485208 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2cc638c-f6d4-4016-b100-9d327b65065c-kube-api-access-fmspb" (OuterVolumeSpecName: "kube-api-access-fmspb") pod "a2cc638c-f6d4-4016-b100-9d327b65065c" (UID: "a2cc638c-f6d4-4016-b100-9d327b65065c"). InnerVolumeSpecName "kube-api-access-fmspb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.505741 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a2cc638c-f6d4-4016-b100-9d327b65065c" (UID: "a2cc638c-f6d4-4016-b100-9d327b65065c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.519114 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a2cc638c-f6d4-4016-b100-9d327b65065c" (UID: "a2cc638c-f6d4-4016-b100-9d327b65065c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.520050 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8267ff8-754f-446f-a7b1-94462f0f9c93","Type":"ContainerStarted","Data":"e738bd822cecca4278152455525905e466acaf73a6a97414c28eca960b5f3615"} Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.521064 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-config" (OuterVolumeSpecName: "config") pod "a2cc638c-f6d4-4016-b100-9d327b65065c" (UID: "a2cc638c-f6d4-4016-b100-9d327b65065c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.522600 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d468d6b8-4xwvd" event={"ID":"33984e31-23ff-4d28-9828-74d12b7fc0a7","Type":"ContainerStarted","Data":"d79d9b98b19c66dc420061e3d3ea414cafbef44bb76a885281ddb24f30144e6c"} Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.522976 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.550857 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a2cc638c-f6d4-4016-b100-9d327b65065c" (UID: "a2cc638c-f6d4-4016-b100-9d327b65065c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.555849 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmspb\" (UniqueName: \"kubernetes.io/projected/a2cc638c-f6d4-4016-b100-9d327b65065c-kube-api-access-fmspb\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.555891 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.555915 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.555930 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.555942 4580 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.589326 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57d468d6b8-4xwvd" podStartSLOduration=4.589301124 podStartE2EDuration="4.589301124s" podCreationTimestamp="2026-03-21 05:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:38.580579832 +0000 UTC m=+1323.663163470" watchObservedRunningTime="2026-03-21 05:13:38.589301124 +0000 UTC m=+1323.671884752" Mar 21 05:13:38 crc kubenswrapper[4580]: 
I0321 05:13:38.592964 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" event={"ID":"3ff1f3d4-eada-4182-9a04-48e23f84d11d","Type":"ContainerStarted","Data":"01a88e01d040064cfcb12f3cab4edb81eb398cca95d442bf18ec54af79c671d3"} Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.621579 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" event={"ID":"a2cc638c-f6d4-4016-b100-9d327b65065c","Type":"ContainerDied","Data":"5ea8087c5160e37e1177c7a19c4e6e13f45c1f9a7d2b28ab87df2f1d973f49ed"} Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.623150 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-ml5pw" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.623267 4580 scope.go:117] "RemoveContainer" containerID="dcd8a97997983f984f6e5091f400bc27b75fcd7a18fe95f08ba34122be406d8b" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.633436 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a2cc638c-f6d4-4016-b100-9d327b65065c" (UID: "a2cc638c-f6d4-4016-b100-9d327b65065c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.634083 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" event={"ID":"03e4309f-b795-4c00-8058-616430f6ea8a","Type":"ContainerStarted","Data":"7dcb1b7a7f920f07fde018a93c10c8821c95a73ee12fb049f533da5edadc5771"} Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.634131 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" event={"ID":"03e4309f-b795-4c00-8058-616430f6ea8a","Type":"ContainerStarted","Data":"9602c4f1b928dce3f6d73f264d3ce38b7d5063b4528308525f46a816767918c1"} Mar 21 05:13:38 crc kubenswrapper[4580]: I0321 05:13:38.663513 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2cc638c-f6d4-4016-b100-9d327b65065c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:39 crc kubenswrapper[4580]: I0321 05:13:39.000099 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-ml5pw"] Mar 21 05:13:39 crc kubenswrapper[4580]: I0321 05:13:39.009139 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-ml5pw"] Mar 21 05:13:39 crc kubenswrapper[4580]: I0321 05:13:39.644331 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2cc638c-f6d4-4016-b100-9d327b65065c" path="/var/lib/kubelet/pods/a2cc638c-f6d4-4016-b100-9d327b65065c/volumes" Mar 21 05:13:39 crc kubenswrapper[4580]: I0321 05:13:39.653528 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" event={"ID":"3ff1f3d4-eada-4182-9a04-48e23f84d11d","Type":"ContainerStarted","Data":"e0c8d817168a9754c07cec9b3d634ccc2dbd8f5fa7d189eee3ad126d5214d84b"} Mar 21 05:13:39 crc kubenswrapper[4580]: I0321 05:13:39.655057 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:39 crc kubenswrapper[4580]: I0321 05:13:39.655104 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:39 crc kubenswrapper[4580]: I0321 05:13:39.691869 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" podStartSLOduration=4.69184487 podStartE2EDuration="4.69184487s" podCreationTimestamp="2026-03-21 05:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:39.679848591 +0000 UTC m=+1324.762432219" watchObservedRunningTime="2026-03-21 05:13:39.69184487 +0000 UTC m=+1324.774428498" Mar 21 05:13:39 crc kubenswrapper[4580]: I0321 05:13:39.699102 4580 generic.go:334] "Generic (PLEG): container finished" podID="03e4309f-b795-4c00-8058-616430f6ea8a" containerID="7dcb1b7a7f920f07fde018a93c10c8821c95a73ee12fb049f533da5edadc5771" exitCode=0 Mar 21 05:13:39 crc kubenswrapper[4580]: I0321 05:13:39.699272 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" event={"ID":"03e4309f-b795-4c00-8058-616430f6ea8a","Type":"ContainerDied","Data":"7dcb1b7a7f920f07fde018a93c10c8821c95a73ee12fb049f533da5edadc5771"} Mar 21 05:13:39 crc kubenswrapper[4580]: I0321 05:13:39.699342 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" event={"ID":"03e4309f-b795-4c00-8058-616430f6ea8a","Type":"ContainerStarted","Data":"8f421d9f786f076c9ee36dd8e72f6279949d0c1c7497da67bb7754fe4da7be07"} Mar 21 05:13:39 crc kubenswrapper[4580]: I0321 05:13:39.745993 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" podStartSLOduration=4.745966049 podStartE2EDuration="4.745966049s" podCreationTimestamp="2026-03-21 05:13:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:39.730076927 +0000 UTC m=+1324.812660575" watchObservedRunningTime="2026-03-21 05:13:39.745966049 +0000 UTC m=+1324.828549677" Mar 21 05:13:40 crc kubenswrapper[4580]: I0321 05:13:40.739866 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8267ff8-754f-446f-a7b1-94462f0f9c93","Type":"ContainerStarted","Data":"9813aa78da4b6d40fe72511148ffa7827baa3bca2fdcafe18f4d6af83fe0fbb2"} Mar 21 05:13:40 crc kubenswrapper[4580]: I0321 05:13:40.744823 4580 generic.go:334] "Generic (PLEG): container finished" podID="c311e091-7cf1-426b-9788-a3d64b198e43" containerID="2ee092012d167f5b248abd488e907d058d7e1ad76a3a5daf34599a61962fa2b6" exitCode=0 Mar 21 05:13:40 crc kubenswrapper[4580]: I0321 05:13:40.744949 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c311e091-7cf1-426b-9788-a3d64b198e43","Type":"ContainerDied","Data":"2ee092012d167f5b248abd488e907d058d7e1ad76a3a5daf34599a61962fa2b6"} Mar 21 05:13:40 crc kubenswrapper[4580]: I0321 05:13:40.746931 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"83350763-3294-4c10-8bf0-531aec2e110f","Type":"ContainerStarted","Data":"6632c12738d4adfcd4405a097615470fb959b76f6c490c8a0639ce26b685ea70"} Mar 21 05:13:40 crc kubenswrapper[4580]: I0321 05:13:40.747901 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.397396 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 
05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.397769 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.398559 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"19fb47284615f4db4d3ee3b8a1bb2963d50724cdbe63d92f0b19442506b6bf5b"} pod="openstack/horizon-587cfc8688-265kc" containerMessage="Container horizon failed startup probe, will be restarted" Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.398598 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" containerID="cri-o://19fb47284615f4db4d3ee3b8a1bb2963d50724cdbe63d92f0b19442506b6bf5b" gracePeriod=30 Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.507609 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.507696 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.508629 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"b03c7b6b3d34260bff0a00bc798a52da6836ea0ee76b7c6df6980b8c29af49eb"} pod="openstack/horizon-67655f8b6-mbx6n" containerMessage="Container horizon failed startup probe, will be restarted" Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.508689 4580 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" containerID="cri-o://b03c7b6b3d34260bff0a00bc798a52da6836ea0ee76b7c6df6980b8c29af49eb" gracePeriod=30 Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.906458 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6cd755485-pmnqc"] Mar 21 05:13:41 crc kubenswrapper[4580]: E0321 05:13:41.906984 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2cc638c-f6d4-4016-b100-9d327b65065c" containerName="init" Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.907001 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2cc638c-f6d4-4016-b100-9d327b65065c" containerName="init" Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.907199 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2cc638c-f6d4-4016-b100-9d327b65065c" containerName="init" Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.908361 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.911913 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.912208 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 21 05:13:41 crc kubenswrapper[4580]: I0321 05:13:41.961165 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6cd755485-pmnqc"] Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.062945 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-config\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.063331 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-httpd-config\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.063406 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-public-tls-certs\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.063445 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-combined-ca-bundle\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.063511 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-ovndb-tls-certs\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.063594 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjwdh\" (UniqueName: \"kubernetes.io/projected/0804de84-fb1f-40cf-af99-b67d2eb64fc4-kube-api-access-fjwdh\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.063652 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-internal-tls-certs\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.164645 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-ovndb-tls-certs\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.165196 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjwdh\" (UniqueName: 
\"kubernetes.io/projected/0804de84-fb1f-40cf-af99-b67d2eb64fc4-kube-api-access-fjwdh\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.165346 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-internal-tls-certs\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.166144 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-config\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.166309 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-httpd-config\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.166427 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-public-tls-certs\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.166541 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-combined-ca-bundle\") pod \"neutron-6cd755485-pmnqc\" 
(UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.175765 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-internal-tls-certs\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.176507 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-public-tls-certs\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.179353 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-combined-ca-bundle\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.179550 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-ovndb-tls-certs\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.180370 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-httpd-config\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 
05:13:42.180372 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0804de84-fb1f-40cf-af99-b67d2eb64fc4-config\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.194708 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjwdh\" (UniqueName: \"kubernetes.io/projected/0804de84-fb1f-40cf-af99-b67d2eb64fc4-kube-api-access-fjwdh\") pod \"neutron-6cd755485-pmnqc\" (UID: \"0804de84-fb1f-40cf-af99-b67d2eb64fc4\") " pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.286798 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.291462 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.369029 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c311e091-7cf1-426b-9788-a3d64b198e43-log-httpd\") pod \"c311e091-7cf1-426b-9788-a3d64b198e43\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.369119 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-scripts\") pod \"c311e091-7cf1-426b-9788-a3d64b198e43\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.369163 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-sg-core-conf-yaml\") pod 
\"c311e091-7cf1-426b-9788-a3d64b198e43\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.369224 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-combined-ca-bundle\") pod \"c311e091-7cf1-426b-9788-a3d64b198e43\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.369277 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-config-data\") pod \"c311e091-7cf1-426b-9788-a3d64b198e43\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.369355 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5v52\" (UniqueName: \"kubernetes.io/projected/c311e091-7cf1-426b-9788-a3d64b198e43-kube-api-access-m5v52\") pod \"c311e091-7cf1-426b-9788-a3d64b198e43\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.369396 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c311e091-7cf1-426b-9788-a3d64b198e43-run-httpd\") pod \"c311e091-7cf1-426b-9788-a3d64b198e43\" (UID: \"c311e091-7cf1-426b-9788-a3d64b198e43\") " Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.370029 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c311e091-7cf1-426b-9788-a3d64b198e43-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c311e091-7cf1-426b-9788-a3d64b198e43" (UID: "c311e091-7cf1-426b-9788-a3d64b198e43"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.371224 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c311e091-7cf1-426b-9788-a3d64b198e43-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c311e091-7cf1-426b-9788-a3d64b198e43" (UID: "c311e091-7cf1-426b-9788-a3d64b198e43"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.377103 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c311e091-7cf1-426b-9788-a3d64b198e43-kube-api-access-m5v52" (OuterVolumeSpecName: "kube-api-access-m5v52") pod "c311e091-7cf1-426b-9788-a3d64b198e43" (UID: "c311e091-7cf1-426b-9788-a3d64b198e43"). InnerVolumeSpecName "kube-api-access-m5v52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.377264 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-scripts" (OuterVolumeSpecName: "scripts") pod "c311e091-7cf1-426b-9788-a3d64b198e43" (UID: "c311e091-7cf1-426b-9788-a3d64b198e43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.406405 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c311e091-7cf1-426b-9788-a3d64b198e43" (UID: "c311e091-7cf1-426b-9788-a3d64b198e43"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.434851 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c311e091-7cf1-426b-9788-a3d64b198e43" (UID: "c311e091-7cf1-426b-9788-a3d64b198e43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.461033 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-config-data" (OuterVolumeSpecName: "config-data") pod "c311e091-7cf1-426b-9788-a3d64b198e43" (UID: "c311e091-7cf1-426b-9788-a3d64b198e43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.471927 4580 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c311e091-7cf1-426b-9788-a3d64b198e43-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.471968 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.471980 4580 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.471995 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:42 
crc kubenswrapper[4580]: I0321 05:13:42.472005 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c311e091-7cf1-426b-9788-a3d64b198e43-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.472015 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5v52\" (UniqueName: \"kubernetes.io/projected/c311e091-7cf1-426b-9788-a3d64b198e43-kube-api-access-m5v52\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.472028 4580 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c311e091-7cf1-426b-9788-a3d64b198e43-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.802036 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c311e091-7cf1-426b-9788-a3d64b198e43","Type":"ContainerDied","Data":"8a6e526f983bd78cd69797adaa7f597120eb814df47b399d7701ac69c847533d"} Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.802312 4580 scope.go:117] "RemoveContainer" containerID="ea1247a808b934e0c8f07cf7b669ca9c2f7c2f17267d5c74df208caeb313ea12" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.802089 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.960873 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:13:42 crc kubenswrapper[4580]: I0321 05:13:42.978400 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.036566 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:13:43 crc kubenswrapper[4580]: E0321 05:13:43.037085 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" containerName="ceilometer-notification-agent" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.037102 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" containerName="ceilometer-notification-agent" Mar 21 05:13:43 crc kubenswrapper[4580]: E0321 05:13:43.037161 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" containerName="proxy-httpd" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.037168 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" containerName="proxy-httpd" Mar 21 05:13:43 crc kubenswrapper[4580]: E0321 05:13:43.037177 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" containerName="sg-core" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.037183 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" containerName="sg-core" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.037503 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" containerName="ceilometer-notification-agent" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.037524 4580 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" containerName="sg-core" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.037533 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" containerName="proxy-httpd" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.039751 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.050250 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.050718 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.060400 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.162656 4580 scope.go:117] "RemoveContainer" containerID="79b346aa629f5e0c52009561345e629f878a979fb3866889dc45c60a0a077a68" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.193357 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm6vf\" (UniqueName: \"kubernetes.io/projected/0694bb24-df06-41c3-a24d-8428090b6df4-kube-api-access-zm6vf\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.193410 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.193607 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-config-data\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.193702 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0694bb24-df06-41c3-a24d-8428090b6df4-run-httpd\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.193744 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.193771 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0694bb24-df06-41c3-a24d-8428090b6df4-log-httpd\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.193837 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-scripts\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.298948 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.299011 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0694bb24-df06-41c3-a24d-8428090b6df4-log-httpd\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.299082 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-scripts\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.299108 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm6vf\" (UniqueName: \"kubernetes.io/projected/0694bb24-df06-41c3-a24d-8428090b6df4-kube-api-access-zm6vf\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.299127 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.299288 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-config-data\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 
05:13:43.299411 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0694bb24-df06-41c3-a24d-8428090b6df4-run-httpd\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.300413 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0694bb24-df06-41c3-a24d-8428090b6df4-run-httpd\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.315252 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0694bb24-df06-41c3-a24d-8428090b6df4-log-httpd\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.339137 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-config-data\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.344923 4580 scope.go:117] "RemoveContainer" containerID="2ee092012d167f5b248abd488e907d058d7e1ad76a3a5daf34599a61962fa2b6" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.354744 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm6vf\" (UniqueName: \"kubernetes.io/projected/0694bb24-df06-41c3-a24d-8428090b6df4-kube-api-access-zm6vf\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.358469 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-scripts\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.375136 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.375917 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") " pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.424487 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.698305 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c311e091-7cf1-426b-9788-a3d64b198e43" path="/var/lib/kubelet/pods/c311e091-7cf1-426b-9788-a3d64b198e43/volumes" Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.770848 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6cd755485-pmnqc"] Mar 21 05:13:43 crc kubenswrapper[4580]: W0321 05:13:43.825902 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0804de84_fb1f_40cf_af99_b67d2eb64fc4.slice/crio-edc484cb82c61f6f40833f5c24c2db7b15c1571c93715a693ee59da711f89fb3 WatchSource:0}: Error finding container edc484cb82c61f6f40833f5c24c2db7b15c1571c93715a693ee59da711f89fb3: Status 404 returned error can't find the container with id edc484cb82c61f6f40833f5c24c2db7b15c1571c93715a693ee59da711f89fb3 Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.871546 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-787f545779-9db4b" event={"ID":"d79bd04a-35d0-48ab-883f-982e3129d435","Type":"ContainerStarted","Data":"fc2a121aaf7a3dbb76e772de50cb5c8663440670eb5fa3275946576d5843ecd5"} Mar 21 05:13:43 crc kubenswrapper[4580]: I0321 05:13:43.907572 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" event={"ID":"0f4b4bf3-0508-4021-916b-97694fe670ff","Type":"ContainerStarted","Data":"8c79ef321ebcc45eabf12b41bf2ab0be609286d8c1a4dda7a65e78fc726bf73f"} Mar 21 05:13:44 crc kubenswrapper[4580]: I0321 05:13:44.090598 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:13:44 crc kubenswrapper[4580]: W0321 05:13:44.114092 4580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0694bb24_df06_41c3_a24d_8428090b6df4.slice/crio-b1e89d7d04d9886474f61f3afde2b48891dabecaf03e02ea3ad4092a37ba7e47 WatchSource:0}: Error finding container b1e89d7d04d9886474f61f3afde2b48891dabecaf03e02ea3ad4092a37ba7e47: Status 404 returned error can't find the container with id b1e89d7d04d9886474f61f3afde2b48891dabecaf03e02ea3ad4092a37ba7e47 Mar 21 05:13:44 crc kubenswrapper[4580]: I0321 05:13:44.925973 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-787f545779-9db4b" event={"ID":"d79bd04a-35d0-48ab-883f-982e3129d435","Type":"ContainerStarted","Data":"bfa4d2bd6ea8e56909e619e504a3e1e14a614a2ee40346822b5f4fdce9bba262"} Mar 21 05:13:44 crc kubenswrapper[4580]: I0321 05:13:44.935685 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0694bb24-df06-41c3-a24d-8428090b6df4","Type":"ContainerStarted","Data":"b1e89d7d04d9886474f61f3afde2b48891dabecaf03e02ea3ad4092a37ba7e47"} Mar 21 05:13:44 crc kubenswrapper[4580]: I0321 05:13:44.944117 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" event={"ID":"0f4b4bf3-0508-4021-916b-97694fe670ff","Type":"ContainerStarted","Data":"c029ed9d72f9244a7b010db31cb4cb9db8454931b0d6799edbb6f5070cbb12fc"} Mar 21 05:13:44 crc kubenswrapper[4580]: I0321 05:13:44.948737 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"83350763-3294-4c10-8bf0-531aec2e110f","Type":"ContainerStarted","Data":"41882dc112c9d9ec4fb4bc5c090d9686f599e7ceb42e70ff547b756470fb9ac3"} Mar 21 05:13:44 crc kubenswrapper[4580]: I0321 05:13:44.975294 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cd755485-pmnqc" event={"ID":"0804de84-fb1f-40cf-af99-b67d2eb64fc4","Type":"ContainerStarted","Data":"09747e491e23201d3aa30e27883ec5976ca51c961ac5022a5330ceae39d5935e"} Mar 21 05:13:44 crc 
kubenswrapper[4580]: I0321 05:13:44.975345 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cd755485-pmnqc" event={"ID":"0804de84-fb1f-40cf-af99-b67d2eb64fc4","Type":"ContainerStarted","Data":"edc484cb82c61f6f40833f5c24c2db7b15c1571c93715a693ee59da711f89fb3"} Mar 21 05:13:45 crc kubenswrapper[4580]: I0321 05:13:45.004047 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-787f545779-9db4b" podStartSLOduration=5.335725321 podStartE2EDuration="11.004023368s" podCreationTimestamp="2026-03-21 05:13:34 +0000 UTC" firstStartedPulling="2026-03-21 05:13:37.136103004 +0000 UTC m=+1322.218686632" lastFinishedPulling="2026-03-21 05:13:42.804401051 +0000 UTC m=+1327.886984679" observedRunningTime="2026-03-21 05:13:44.966381567 +0000 UTC m=+1330.048965205" watchObservedRunningTime="2026-03-21 05:13:45.004023368 +0000 UTC m=+1330.086607006" Mar 21 05:13:45 crc kubenswrapper[4580]: I0321 05:13:45.040628 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8267ff8-754f-446f-a7b1-94462f0f9c93","Type":"ContainerStarted","Data":"c48256f4ad3c92fc98688db4c7fc3dbc3507966ff1f6f93b840cfa6e24de8913"} Mar 21 05:13:45 crc kubenswrapper[4580]: I0321 05:13:45.041532 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 21 05:13:45 crc kubenswrapper[4580]: I0321 05:13:45.041602 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d8267ff8-754f-446f-a7b1-94462f0f9c93" containerName="cinder-api-log" containerID="cri-o://9813aa78da4b6d40fe72511148ffa7827baa3bca2fdcafe18f4d6af83fe0fbb2" gracePeriod=30 Mar 21 05:13:45 crc kubenswrapper[4580]: I0321 05:13:45.041652 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d8267ff8-754f-446f-a7b1-94462f0f9c93" containerName="cinder-api" 
containerID="cri-o://c48256f4ad3c92fc98688db4c7fc3dbc3507966ff1f6f93b840cfa6e24de8913" gracePeriod=30 Mar 21 05:13:45 crc kubenswrapper[4580]: I0321 05:13:45.088384 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=9.366733414 podStartE2EDuration="11.08836221s" podCreationTimestamp="2026-03-21 05:13:34 +0000 UTC" firstStartedPulling="2026-03-21 05:13:37.178340578 +0000 UTC m=+1322.260924206" lastFinishedPulling="2026-03-21 05:13:38.899969374 +0000 UTC m=+1323.982553002" observedRunningTime="2026-03-21 05:13:45.010295054 +0000 UTC m=+1330.092878682" watchObservedRunningTime="2026-03-21 05:13:45.08836221 +0000 UTC m=+1330.170945838" Mar 21 05:13:45 crc kubenswrapper[4580]: I0321 05:13:45.115661 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7cfb6cbb9d-ln66c" podStartSLOduration=5.135517727 podStartE2EDuration="11.115639825s" podCreationTimestamp="2026-03-21 05:13:34 +0000 UTC" firstStartedPulling="2026-03-21 05:13:36.815463599 +0000 UTC m=+1321.898047227" lastFinishedPulling="2026-03-21 05:13:42.795585697 +0000 UTC m=+1327.878169325" observedRunningTime="2026-03-21 05:13:45.036124291 +0000 UTC m=+1330.118707929" watchObservedRunningTime="2026-03-21 05:13:45.115639825 +0000 UTC m=+1330.198223443" Mar 21 05:13:45 crc kubenswrapper[4580]: I0321 05:13:45.124653 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=10.124633115 podStartE2EDuration="10.124633115s" podCreationTimestamp="2026-03-21 05:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:45.08310629 +0000 UTC m=+1330.165689938" watchObservedRunningTime="2026-03-21 05:13:45.124633115 +0000 UTC m=+1330.207216743" Mar 21 05:13:45 crc kubenswrapper[4580]: I0321 05:13:45.382083 4580 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 21 05:13:45 crc kubenswrapper[4580]: I0321 05:13:45.386924 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="83350763-3294-4c10-8bf0-531aec2e110f" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.160:8080/\": dial tcp 10.217.0.160:8080: connect: connection refused" Mar 21 05:13:45 crc kubenswrapper[4580]: I0321 05:13:45.953342 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:13:45 crc kubenswrapper[4580]: I0321 05:13:45.953635 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.053124 4580 generic.go:334] "Generic (PLEG): container finished" podID="d8267ff8-754f-446f-a7b1-94462f0f9c93" containerID="9813aa78da4b6d40fe72511148ffa7827baa3bca2fdcafe18f4d6af83fe0fbb2" exitCode=143 Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.053213 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8267ff8-754f-446f-a7b1-94462f0f9c93","Type":"ContainerDied","Data":"9813aa78da4b6d40fe72511148ffa7827baa3bca2fdcafe18f4d6af83fe0fbb2"} Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.059973 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0694bb24-df06-41c3-a24d-8428090b6df4","Type":"ContainerStarted","Data":"1e32fbc06eb17d4fbb9d9fd6f9781c6231cf73fa6cc7275bb60b3fbd2ba6c4bf"} Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.060035 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0694bb24-df06-41c3-a24d-8428090b6df4","Type":"ContainerStarted","Data":"13cdf4bc25a2a4ff4f44203ac90f53c94b3c3f7df282de0ab9155d6b47bc2473"} Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.066003 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cd755485-pmnqc" event={"ID":"0804de84-fb1f-40cf-af99-b67d2eb64fc4","Type":"ContainerStarted","Data":"ae37dfbe51dd1f32e4bb551158f8ad19e9ef9ae34111c311974feeab8b778dd9"} Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.066561 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.105274 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6cd755485-pmnqc" podStartSLOduration=5.105254349 podStartE2EDuration="5.105254349s" podCreationTimestamp="2026-03-21 05:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:46.099114016 +0000 UTC m=+1331.181697684" watchObservedRunningTime="2026-03-21 05:13:46.105254349 +0000 UTC m=+1331.187837977" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.206999 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.306796 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gfzjq"] Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.314939 4580 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" podUID="164fdcfb-ffa0-4152-b9e8-d3f29c16090c" containerName="dnsmasq-dns" containerID="cri-o://a4d3a88bdacd15870c94059cd5a3ac2c28c44040866ad4019864559db267a928" gracePeriod=10 Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.446372 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6d486fc764-m7r7b"] Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.453129 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.463215 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.464552 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.556102 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d486fc764-m7r7b"] Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.607961 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-config-data-custom\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.608237 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-config-data\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.608399 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-internal-tls-certs\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.608512 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c225fecd-c259-40cb-898c-78dc724d1db8-logs\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.608660 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnrhm\" (UniqueName: \"kubernetes.io/projected/c225fecd-c259-40cb-898c-78dc724d1db8-kube-api-access-cnrhm\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.608835 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-combined-ca-bundle\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.608923 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-public-tls-certs\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc 
kubenswrapper[4580]: I0321 05:13:46.710893 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-combined-ca-bundle\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.711401 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-public-tls-certs\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.711517 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-config-data-custom\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.711593 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-config-data\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.711714 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-internal-tls-certs\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 
05:13:46.711841 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c225fecd-c259-40cb-898c-78dc724d1db8-logs\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.711967 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnrhm\" (UniqueName: \"kubernetes.io/projected/c225fecd-c259-40cb-898c-78dc724d1db8-kube-api-access-cnrhm\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.716832 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c225fecd-c259-40cb-898c-78dc724d1db8-logs\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.728557 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-public-tls-certs\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.733209 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-config-data\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.746468 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cnrhm\" (UniqueName: \"kubernetes.io/projected/c225fecd-c259-40cb-898c-78dc724d1db8-kube-api-access-cnrhm\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.748486 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-combined-ca-bundle\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.770083 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-config-data-custom\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.770371 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c225fecd-c259-40cb-898c-78dc724d1db8-internal-tls-certs\") pod \"barbican-api-6d486fc764-m7r7b\" (UID: \"c225fecd-c259-40cb-898c-78dc724d1db8\") " pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.846942 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.847135 4580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.883538 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 21 05:13:46 crc kubenswrapper[4580]: I0321 05:13:46.929422 
4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.135272 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.146618 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0694bb24-df06-41c3-a24d-8428090b6df4","Type":"ContainerStarted","Data":"db67d4321b5cd6dafcdc1ed5a2a09ad8d3610f33d42492f866565584c063d3d2"} Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.173751 4580 generic.go:334] "Generic (PLEG): container finished" podID="164fdcfb-ffa0-4152-b9e8-d3f29c16090c" containerID="a4d3a88bdacd15870c94059cd5a3ac2c28c44040866ad4019864559db267a928" exitCode=0 Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.173858 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" event={"ID":"164fdcfb-ffa0-4152-b9e8-d3f29c16090c","Type":"ContainerDied","Data":"a4d3a88bdacd15870c94059cd5a3ac2c28c44040866ad4019864559db267a928"} Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.173922 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" event={"ID":"164fdcfb-ffa0-4152-b9e8-d3f29c16090c","Type":"ContainerDied","Data":"c329d6206f4e3e108b2f1bcccb07b5694138c8084bbd91a2b0ac850bab3f72c0"} Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.173943 4580 scope.go:117] "RemoveContainer" containerID="a4d3a88bdacd15870c94059cd5a3ac2c28c44040866ad4019864559db267a928" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.174249 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gfzjq" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.246355 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87sj9\" (UniqueName: \"kubernetes.io/projected/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-kube-api-access-87sj9\") pod \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.246790 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-dns-swift-storage-0\") pod \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.246819 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-ovsdbserver-nb\") pod \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.246875 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-config\") pod \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.246898 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-dns-svc\") pod \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.246927 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-ovsdbserver-sb\") pod \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\" (UID: \"164fdcfb-ffa0-4152-b9e8-d3f29c16090c\") " Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.258119 4580 scope.go:117] "RemoveContainer" containerID="928f56c5c23f263103d35fb9a339d137c928c02e81b63ac85da13955b63e8fe3" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.277025 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-kube-api-access-87sj9" (OuterVolumeSpecName: "kube-api-access-87sj9") pod "164fdcfb-ffa0-4152-b9e8-d3f29c16090c" (UID: "164fdcfb-ffa0-4152-b9e8-d3f29c16090c"). InnerVolumeSpecName "kube-api-access-87sj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.336511 4580 scope.go:117] "RemoveContainer" containerID="a4d3a88bdacd15870c94059cd5a3ac2c28c44040866ad4019864559db267a928" Mar 21 05:13:47 crc kubenswrapper[4580]: E0321 05:13:47.339202 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4d3a88bdacd15870c94059cd5a3ac2c28c44040866ad4019864559db267a928\": container with ID starting with a4d3a88bdacd15870c94059cd5a3ac2c28c44040866ad4019864559db267a928 not found: ID does not exist" containerID="a4d3a88bdacd15870c94059cd5a3ac2c28c44040866ad4019864559db267a928" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.339235 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4d3a88bdacd15870c94059cd5a3ac2c28c44040866ad4019864559db267a928"} err="failed to get container status \"a4d3a88bdacd15870c94059cd5a3ac2c28c44040866ad4019864559db267a928\": rpc error: code = NotFound desc = could not find container \"a4d3a88bdacd15870c94059cd5a3ac2c28c44040866ad4019864559db267a928\": container with ID starting with 
a4d3a88bdacd15870c94059cd5a3ac2c28c44040866ad4019864559db267a928 not found: ID does not exist" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.339259 4580 scope.go:117] "RemoveContainer" containerID="928f56c5c23f263103d35fb9a339d137c928c02e81b63ac85da13955b63e8fe3" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.350746 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87sj9\" (UniqueName: \"kubernetes.io/projected/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-kube-api-access-87sj9\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.377127 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.377299 4580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 05:13:47 crc kubenswrapper[4580]: E0321 05:13:47.400705 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"928f56c5c23f263103d35fb9a339d137c928c02e81b63ac85da13955b63e8fe3\": container with ID starting with 928f56c5c23f263103d35fb9a339d137c928c02e81b63ac85da13955b63e8fe3 not found: ID does not exist" containerID="928f56c5c23f263103d35fb9a339d137c928c02e81b63ac85da13955b63e8fe3" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.400755 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"928f56c5c23f263103d35fb9a339d137c928c02e81b63ac85da13955b63e8fe3"} err="failed to get container status \"928f56c5c23f263103d35fb9a339d137c928c02e81b63ac85da13955b63e8fe3\": rpc error: code = NotFound desc = could not find container \"928f56c5c23f263103d35fb9a339d137c928c02e81b63ac85da13955b63e8fe3\": container with ID starting with 928f56c5c23f263103d35fb9a339d137c928c02e81b63ac85da13955b63e8fe3 not found: ID does not exist" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 
05:13:47.414758 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-config" (OuterVolumeSpecName: "config") pod "164fdcfb-ffa0-4152-b9e8-d3f29c16090c" (UID: "164fdcfb-ffa0-4152-b9e8-d3f29c16090c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.428153 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.453187 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.557341 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "164fdcfb-ffa0-4152-b9e8-d3f29c16090c" (UID: "164fdcfb-ffa0-4152-b9e8-d3f29c16090c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.559277 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.572536 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "164fdcfb-ffa0-4152-b9e8-d3f29c16090c" (UID: "164fdcfb-ffa0-4152-b9e8-d3f29c16090c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.623024 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "164fdcfb-ffa0-4152-b9e8-d3f29c16090c" (UID: "164fdcfb-ffa0-4152-b9e8-d3f29c16090c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.651934 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "164fdcfb-ffa0-4152-b9e8-d3f29c16090c" (UID: "164fdcfb-ffa0-4152-b9e8-d3f29c16090c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.663948 4580 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.663983 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.663992 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/164fdcfb-ffa0-4152-b9e8-d3f29c16090c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.827593 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gfzjq"] Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.840258 4580 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gfzjq"] Mar 21 05:13:47 crc kubenswrapper[4580]: I0321 05:13:47.869759 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d486fc764-m7r7b"] Mar 21 05:13:48 crc kubenswrapper[4580]: I0321 05:13:48.188340 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d486fc764-m7r7b" event={"ID":"c225fecd-c259-40cb-898c-78dc724d1db8","Type":"ContainerStarted","Data":"886544ca2b70c8739e800f037beceeb00cf8d193d7c005389086e4d3c46dc6de"} Mar 21 05:13:49 crc kubenswrapper[4580]: I0321 05:13:49.210298 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d486fc764-m7r7b" event={"ID":"c225fecd-c259-40cb-898c-78dc724d1db8","Type":"ContainerStarted","Data":"4e41942e22f2262fe276d76d6a16f06c080182b4229fb5b356071103b145ef7f"} Mar 21 05:13:49 crc kubenswrapper[4580]: I0321 05:13:49.210757 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d486fc764-m7r7b" event={"ID":"c225fecd-c259-40cb-898c-78dc724d1db8","Type":"ContainerStarted","Data":"f6577ffcd7d836119101bad0eb29b0052e91ed61ae250119e4c21458733f297b"} Mar 21 05:13:49 crc kubenswrapper[4580]: I0321 05:13:49.211218 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:49 crc kubenswrapper[4580]: I0321 05:13:49.211248 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d486fc764-m7r7b" Mar 21 05:13:49 crc kubenswrapper[4580]: I0321 05:13:49.221947 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0694bb24-df06-41c3-a24d-8428090b6df4","Type":"ContainerStarted","Data":"b83fd3821ac6334371330cc325afa955d8d19b6906ad78933d2d62d6b4ec28eb"} Mar 21 05:13:49 crc kubenswrapper[4580]: I0321 05:13:49.222233 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" 
Mar 21 05:13:49 crc kubenswrapper[4580]: I0321 05:13:49.234389 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6d486fc764-m7r7b" podStartSLOduration=3.23436647 podStartE2EDuration="3.23436647s" podCreationTimestamp="2026-03-21 05:13:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:49.230399725 +0000 UTC m=+1334.312983373" watchObservedRunningTime="2026-03-21 05:13:49.23436647 +0000 UTC m=+1334.316950108" Mar 21 05:13:49 crc kubenswrapper[4580]: I0321 05:13:49.272755 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.69228856 podStartE2EDuration="7.27272919s" podCreationTimestamp="2026-03-21 05:13:42 +0000 UTC" firstStartedPulling="2026-03-21 05:13:44.121878163 +0000 UTC m=+1329.204461781" lastFinishedPulling="2026-03-21 05:13:48.702318773 +0000 UTC m=+1333.784902411" observedRunningTime="2026-03-21 05:13:49.257033683 +0000 UTC m=+1334.339617331" watchObservedRunningTime="2026-03-21 05:13:49.27272919 +0000 UTC m=+1334.355312818" Mar 21 05:13:49 crc kubenswrapper[4580]: I0321 05:13:49.628445 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="164fdcfb-ffa0-4152-b9e8-d3f29c16090c" path="/var/lib/kubelet/pods/164fdcfb-ffa0-4152-b9e8-d3f29c16090c/volumes" Mar 21 05:13:50 crc kubenswrapper[4580]: I0321 05:13:50.094041 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:13:50 crc kubenswrapper[4580]: I0321 05:13:50.094905 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" 
podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:13:50 crc kubenswrapper[4580]: I0321 05:13:50.810188 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 21 05:13:50 crc kubenswrapper[4580]: I0321 05:13:50.848031 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 05:13:51 crc kubenswrapper[4580]: I0321 05:13:51.074028 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:13:51 crc kubenswrapper[4580]: I0321 05:13:51.074022 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:13:51 crc kubenswrapper[4580]: I0321 05:13:51.239006 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="83350763-3294-4c10-8bf0-531aec2e110f" containerName="cinder-scheduler" containerID="cri-o://6632c12738d4adfcd4405a097615470fb959b76f6c490c8a0639ce26b685ea70" gracePeriod=30 Mar 21 05:13:51 crc kubenswrapper[4580]: I0321 05:13:51.239060 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="83350763-3294-4c10-8bf0-531aec2e110f" containerName="probe" 
containerID="cri-o://41882dc112c9d9ec4fb4bc5c090d9686f599e7ceb42e70ff547b756470fb9ac3" gracePeriod=30 Mar 21 05:13:52 crc kubenswrapper[4580]: I0321 05:13:52.251845 4580 generic.go:334] "Generic (PLEG): container finished" podID="83350763-3294-4c10-8bf0-531aec2e110f" containerID="41882dc112c9d9ec4fb4bc5c090d9686f599e7ceb42e70ff547b756470fb9ac3" exitCode=0 Mar 21 05:13:52 crc kubenswrapper[4580]: I0321 05:13:52.251883 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"83350763-3294-4c10-8bf0-531aec2e110f","Type":"ContainerDied","Data":"41882dc112c9d9ec4fb4bc5c090d9686f599e7ceb42e70ff547b756470fb9ac3"} Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.194953 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55886d54c6-b2qbq" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.349723 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55886d54c6-b2qbq" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.658221 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-67f74b898d-dtzvd"] Mar 21 05:13:53 crc kubenswrapper[4580]: E0321 05:13:53.662927 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164fdcfb-ffa0-4152-b9e8-d3f29c16090c" containerName="dnsmasq-dns" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.663258 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="164fdcfb-ffa0-4152-b9e8-d3f29c16090c" containerName="dnsmasq-dns" Mar 21 05:13:53 crc kubenswrapper[4580]: E0321 05:13:53.663359 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164fdcfb-ffa0-4152-b9e8-d3f29c16090c" containerName="init" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.663438 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="164fdcfb-ffa0-4152-b9e8-d3f29c16090c" containerName="init" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 
05:13:53.664334 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="164fdcfb-ffa0-4152-b9e8-d3f29c16090c" containerName="dnsmasq-dns" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.666647 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.765937 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67f74b898d-dtzvd"] Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.788315 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-logs\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.788453 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-combined-ca-bundle\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.788482 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-scripts\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.788615 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f6wv\" (UniqueName: \"kubernetes.io/projected/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-kube-api-access-7f6wv\") pod 
\"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.788724 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-internal-tls-certs\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.788808 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-config-data\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.788865 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-public-tls-certs\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.890982 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f6wv\" (UniqueName: \"kubernetes.io/projected/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-kube-api-access-7f6wv\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.891081 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-internal-tls-certs\") pod 
\"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.891128 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-config-data\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.891170 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-public-tls-certs\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.891242 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-logs\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.891291 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-combined-ca-bundle\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.891315 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-scripts\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " 
pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.892133 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-logs\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.905312 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-scripts\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.905532 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-public-tls-certs\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.908685 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-internal-tls-certs\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.912579 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-combined-ca-bundle\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.913535 4580 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-config-data\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.929474 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f6wv\" (UniqueName: \"kubernetes.io/projected/3cbb1901-c5ee-4f46-aa6d-ac31372a9b83-kube-api-access-7f6wv\") pod \"placement-67f74b898d-dtzvd\" (UID: \"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83\") " pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:53 crc kubenswrapper[4580]: I0321 05:13:53.987723 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:54 crc kubenswrapper[4580]: I0321 05:13:54.015244 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:13:54 crc kubenswrapper[4580]: I0321 05:13:54.579437 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67f74b898d-dtzvd"] Mar 21 05:13:54 crc kubenswrapper[4580]: W0321 05:13:54.598989 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cbb1901_c5ee_4f46_aa6d_ac31372a9b83.slice/crio-dad7bfc7474ff406373a412967a7ec0235e1f24dfbbe2bf0540e97676bd412fa WatchSource:0}: Error finding container dad7bfc7474ff406373a412967a7ec0235e1f24dfbbe2bf0540e97676bd412fa: Status 404 returned error can't find the container with id dad7bfc7474ff406373a412967a7ec0235e1f24dfbbe2bf0540e97676bd412fa Mar 21 05:13:55 crc kubenswrapper[4580]: I0321 05:13:55.309766 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" Mar 21 05:13:55 crc kubenswrapper[4580]: I0321 05:13:55.320367 4580 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-67f74b898d-dtzvd" event={"ID":"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83","Type":"ContainerStarted","Data":"e761964e84b8d509235a99e41a043ee7a8fd819025ff4ef5747650d41ce3c109"} Mar 21 05:13:55 crc kubenswrapper[4580]: I0321 05:13:55.320413 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67f74b898d-dtzvd" event={"ID":"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83","Type":"ContainerStarted","Data":"76f2532bfa9c676b83c5a595e1ab9521ab2613381351e3a2cfd29e077e475f26"} Mar 21 05:13:55 crc kubenswrapper[4580]: I0321 05:13:55.320427 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67f74b898d-dtzvd" event={"ID":"3cbb1901-c5ee-4f46-aa6d-ac31372a9b83","Type":"ContainerStarted","Data":"dad7bfc7474ff406373a412967a7ec0235e1f24dfbbe2bf0540e97676bd412fa"} Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.065738 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.262597 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-config-data-custom\") pod \"83350763-3294-4c10-8bf0-531aec2e110f\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.262644 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-combined-ca-bundle\") pod \"83350763-3294-4c10-8bf0-531aec2e110f\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") " Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.262680 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-config-data\") pod \"83350763-3294-4c10-8bf0-531aec2e110f\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") "
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.262740 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83350763-3294-4c10-8bf0-531aec2e110f-etc-machine-id\") pod \"83350763-3294-4c10-8bf0-531aec2e110f\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") "
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.262895 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grp5n\" (UniqueName: \"kubernetes.io/projected/83350763-3294-4c10-8bf0-531aec2e110f-kube-api-access-grp5n\") pod \"83350763-3294-4c10-8bf0-531aec2e110f\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") "
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.262948 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-scripts\") pod \"83350763-3294-4c10-8bf0-531aec2e110f\" (UID: \"83350763-3294-4c10-8bf0-531aec2e110f\") "
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.264855 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83350763-3294-4c10-8bf0-531aec2e110f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "83350763-3294-4c10-8bf0-531aec2e110f" (UID: "83350763-3294-4c10-8bf0-531aec2e110f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.273042 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "83350763-3294-4c10-8bf0-531aec2e110f" (UID: "83350763-3294-4c10-8bf0-531aec2e110f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.282279 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83350763-3294-4c10-8bf0-531aec2e110f-kube-api-access-grp5n" (OuterVolumeSpecName: "kube-api-access-grp5n") pod "83350763-3294-4c10-8bf0-531aec2e110f" (UID: "83350763-3294-4c10-8bf0-531aec2e110f"). InnerVolumeSpecName "kube-api-access-grp5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.288571 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-scripts" (OuterVolumeSpecName: "scripts") pod "83350763-3294-4c10-8bf0-531aec2e110f" (UID: "83350763-3294-4c10-8bf0-531aec2e110f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.339514 4580 generic.go:334] "Generic (PLEG): container finished" podID="83350763-3294-4c10-8bf0-531aec2e110f" containerID="6632c12738d4adfcd4405a097615470fb959b76f6c490c8a0639ce26b685ea70" exitCode=0
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.340900 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"83350763-3294-4c10-8bf0-531aec2e110f","Type":"ContainerDied","Data":"6632c12738d4adfcd4405a097615470fb959b76f6c490c8a0639ce26b685ea70"}
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.340945 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"83350763-3294-4c10-8bf0-531aec2e110f","Type":"ContainerDied","Data":"59d6ef2c10096824b21dcd8cf16ffd91233bdb04ace7bcca0f5e080ec08e9829"}
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.340967 4580 scope.go:117] "RemoveContainer" containerID="41882dc112c9d9ec4fb4bc5c090d9686f599e7ceb42e70ff547b756470fb9ac3"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.341138 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.341381 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67f74b898d-dtzvd"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.341449 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67f74b898d-dtzvd"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.364722 4580 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83350763-3294-4c10-8bf0-531aec2e110f-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.365053 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grp5n\" (UniqueName: \"kubernetes.io/projected/83350763-3294-4c10-8bf0-531aec2e110f-kube-api-access-grp5n\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.365065 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.365075 4580 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.414051 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83350763-3294-4c10-8bf0-531aec2e110f" (UID: "83350763-3294-4c10-8bf0-531aec2e110f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.438336 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-67f74b898d-dtzvd" podStartSLOduration=3.43830885 podStartE2EDuration="3.43830885s" podCreationTimestamp="2026-03-21 05:13:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:13:56.399236731 +0000 UTC m=+1341.481820379" watchObservedRunningTime="2026-03-21 05:13:56.43830885 +0000 UTC m=+1341.520892478"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.454051 4580 scope.go:117] "RemoveContainer" containerID="6632c12738d4adfcd4405a097615470fb959b76f6c490c8a0639ce26b685ea70"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.467135 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.479816 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-config-data" (OuterVolumeSpecName: "config-data") pod "83350763-3294-4c10-8bf0-531aec2e110f" (UID: "83350763-3294-4c10-8bf0-531aec2e110f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.484711 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="d8267ff8-754f-446f-a7b1-94462f0f9c93" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.164:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.484859 4580 scope.go:117] "RemoveContainer" containerID="41882dc112c9d9ec4fb4bc5c090d9686f599e7ceb42e70ff547b756470fb9ac3"
Mar 21 05:13:56 crc kubenswrapper[4580]: E0321 05:13:56.488433 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41882dc112c9d9ec4fb4bc5c090d9686f599e7ceb42e70ff547b756470fb9ac3\": container with ID starting with 41882dc112c9d9ec4fb4bc5c090d9686f599e7ceb42e70ff547b756470fb9ac3 not found: ID does not exist" containerID="41882dc112c9d9ec4fb4bc5c090d9686f599e7ceb42e70ff547b756470fb9ac3"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.488683 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41882dc112c9d9ec4fb4bc5c090d9686f599e7ceb42e70ff547b756470fb9ac3"} err="failed to get container status \"41882dc112c9d9ec4fb4bc5c090d9686f599e7ceb42e70ff547b756470fb9ac3\": rpc error: code = NotFound desc = could not find container \"41882dc112c9d9ec4fb4bc5c090d9686f599e7ceb42e70ff547b756470fb9ac3\": container with ID starting with 41882dc112c9d9ec4fb4bc5c090d9686f599e7ceb42e70ff547b756470fb9ac3 not found: ID does not exist"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.488824 4580 scope.go:117] "RemoveContainer" containerID="6632c12738d4adfcd4405a097615470fb959b76f6c490c8a0639ce26b685ea70"
Mar 21 05:13:56 crc kubenswrapper[4580]: E0321 05:13:56.490922 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6632c12738d4adfcd4405a097615470fb959b76f6c490c8a0639ce26b685ea70\": container with ID starting with 6632c12738d4adfcd4405a097615470fb959b76f6c490c8a0639ce26b685ea70 not found: ID does not exist" containerID="6632c12738d4adfcd4405a097615470fb959b76f6c490c8a0639ce26b685ea70"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.490969 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6632c12738d4adfcd4405a097615470fb959b76f6c490c8a0639ce26b685ea70"} err="failed to get container status \"6632c12738d4adfcd4405a097615470fb959b76f6c490c8a0639ce26b685ea70\": rpc error: code = NotFound desc = could not find container \"6632c12738d4adfcd4405a097615470fb959b76f6c490c8a0639ce26b685ea70\": container with ID starting with 6632c12738d4adfcd4405a097615470fb959b76f6c490c8a0639ce26b685ea70 not found: ID does not exist"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.569042 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83350763-3294-4c10-8bf0-531aec2e110f-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.674897 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.684564 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.712042 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 21 05:13:56 crc kubenswrapper[4580]: E0321 05:13:56.712471 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83350763-3294-4c10-8bf0-531aec2e110f" containerName="probe"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.712488 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="83350763-3294-4c10-8bf0-531aec2e110f" containerName="probe"
Mar 21 05:13:56 crc kubenswrapper[4580]: E0321 05:13:56.712514 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83350763-3294-4c10-8bf0-531aec2e110f" containerName="cinder-scheduler"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.712522 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="83350763-3294-4c10-8bf0-531aec2e110f" containerName="cinder-scheduler"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.712748 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="83350763-3294-4c10-8bf0-531aec2e110f" containerName="probe"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.712791 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="83350763-3294-4c10-8bf0-531aec2e110f" containerName="cinder-scheduler"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.713911 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.716633 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.727022 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.882660 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6a3d4de-9969-48f3-9f1a-9f273f81050a-config-data\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.882734 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6a3d4de-9969-48f3-9f1a-9f273f81050a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.882770 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6a3d4de-9969-48f3-9f1a-9f273f81050a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.882962 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6a3d4de-9969-48f3-9f1a-9f273f81050a-scripts\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.883075 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvnc5\" (UniqueName: \"kubernetes.io/projected/a6a3d4de-9969-48f3-9f1a-9f273f81050a-kube-api-access-jvnc5\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.883235 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6a3d4de-9969-48f3-9f1a-9f273f81050a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.985412 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6a3d4de-9969-48f3-9f1a-9f273f81050a-config-data\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.985805 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6a3d4de-9969-48f3-9f1a-9f273f81050a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.985839 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6a3d4de-9969-48f3-9f1a-9f273f81050a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.985881 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6a3d4de-9969-48f3-9f1a-9f273f81050a-scripts\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.985937 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvnc5\" (UniqueName: \"kubernetes.io/projected/a6a3d4de-9969-48f3-9f1a-9f273f81050a-kube-api-access-jvnc5\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.986016 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6a3d4de-9969-48f3-9f1a-9f273f81050a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:56 crc kubenswrapper[4580]: I0321 05:13:56.986150 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6a3d4de-9969-48f3-9f1a-9f273f81050a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:57 crc kubenswrapper[4580]: I0321 05:13:57.288221 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6a3d4de-9969-48f3-9f1a-9f273f81050a-config-data\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:57 crc kubenswrapper[4580]: I0321 05:13:57.317343 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6a3d4de-9969-48f3-9f1a-9f273f81050a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:57 crc kubenswrapper[4580]: I0321 05:13:57.318096 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6a3d4de-9969-48f3-9f1a-9f273f81050a-scripts\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:57 crc kubenswrapper[4580]: I0321 05:13:57.318224 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvnc5\" (UniqueName: \"kubernetes.io/projected/a6a3d4de-9969-48f3-9f1a-9f273f81050a-kube-api-access-jvnc5\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:57 crc kubenswrapper[4580]: I0321 05:13:57.318581 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6a3d4de-9969-48f3-9f1a-9f273f81050a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a6a3d4de-9969-48f3-9f1a-9f273f81050a\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:13:57 crc kubenswrapper[4580]: I0321 05:13:57.331093 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 21 05:13:57 crc kubenswrapper[4580]: I0321 05:13:57.631028 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83350763-3294-4c10-8bf0-531aec2e110f" path="/var/lib/kubelet/pods/83350763-3294-4c10-8bf0-531aec2e110f/volumes"
Mar 21 05:13:58 crc kubenswrapper[4580]: I0321 05:13:58.218599 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 21 05:13:58 crc kubenswrapper[4580]: I0321 05:13:58.368498 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6a3d4de-9969-48f3-9f1a-9f273f81050a","Type":"ContainerStarted","Data":"7a57664ce5a9e3257251c798901b7fc622ae696e521b778ec8a461735f2977ab"}
Mar 21 05:13:59 crc kubenswrapper[4580]: I0321 05:13:59.380904 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6a3d4de-9969-48f3-9f1a-9f273f81050a","Type":"ContainerStarted","Data":"59267926e8da52f43c86e94d53a167514de7db27c082d6ce44a4a3178f453e8a"}
Mar 21 05:13:59 crc kubenswrapper[4580]: I0321 05:13:59.763226 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6d486fc764-m7r7b"
Mar 21 05:13:59 crc kubenswrapper[4580]: I0321 05:13:59.900327 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6d486fc764-m7r7b"
Mar 21 05:13:59 crc kubenswrapper[4580]: I0321 05:13:59.980933 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5bbbf4f7bb-rvhzb"]
Mar 21 05:13:59 crc kubenswrapper[4580]: I0321 05:13:59.981200 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api-log" containerID="cri-o://01a88e01d040064cfcb12f3cab4edb81eb398cca95d442bf18ec54af79c671d3" gracePeriod=30
Mar 21 05:13:59 crc kubenswrapper[4580]: I0321 05:13:59.981673 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api" containerID="cri-o://e0c8d817168a9754c07cec9b3d634ccc2dbd8f5fa7d189eee3ad126d5214d84b" gracePeriod=30
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.181611 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567834-5xjpq"]
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.183177 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-5xjpq"
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.200325 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc"
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.200599 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.200770 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.235489 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-5xjpq"]
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.267253 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62gvb\" (UniqueName: \"kubernetes.io/projected/80b82bd4-51bb-4e57-9c87-779dc26bfbf1-kube-api-access-62gvb\") pod \"auto-csr-approver-29567834-5xjpq\" (UID: \"80b82bd4-51bb-4e57-9c87-779dc26bfbf1\") " pod="openshift-infra/auto-csr-approver-29567834-5xjpq"
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.369901 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62gvb\" (UniqueName: \"kubernetes.io/projected/80b82bd4-51bb-4e57-9c87-779dc26bfbf1-kube-api-access-62gvb\") pod \"auto-csr-approver-29567834-5xjpq\" (UID: \"80b82bd4-51bb-4e57-9c87-779dc26bfbf1\") " pod="openshift-infra/auto-csr-approver-29567834-5xjpq"
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.397715 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62gvb\" (UniqueName: \"kubernetes.io/projected/80b82bd4-51bb-4e57-9c87-779dc26bfbf1-kube-api-access-62gvb\") pod \"auto-csr-approver-29567834-5xjpq\" (UID: \"80b82bd4-51bb-4e57-9c87-779dc26bfbf1\") " pod="openshift-infra/auto-csr-approver-29567834-5xjpq"
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.423223 4580 generic.go:334] "Generic (PLEG): container finished" podID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerID="01a88e01d040064cfcb12f3cab4edb81eb398cca95d442bf18ec54af79c671d3" exitCode=143
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.425750 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" event={"ID":"3ff1f3d4-eada-4182-9a04-48e23f84d11d","Type":"ContainerDied","Data":"01a88e01d040064cfcb12f3cab4edb81eb398cca95d442bf18ec54af79c671d3"}
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.542145 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-5xjpq"
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.659928 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6d45658b5d-dfjj4"
Mar 21 05:14:00 crc kubenswrapper[4580]: I0321 05:14:00.911560 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 21 05:14:01 crc kubenswrapper[4580]: W0321 05:14:01.281749 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80b82bd4_51bb_4e57_9c87_779dc26bfbf1.slice/crio-00c99541059f023109dcbc1d89756f81275e8b14fa7f54467c341c7304774c72 WatchSource:0}: Error finding container 00c99541059f023109dcbc1d89756f81275e8b14fa7f54467c341c7304774c72: Status 404 returned error can't find the container with id 00c99541059f023109dcbc1d89756f81275e8b14fa7f54467c341c7304774c72
Mar 21 05:14:01 crc kubenswrapper[4580]: I0321 05:14:01.305879 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-5xjpq"]
Mar 21 05:14:01 crc kubenswrapper[4580]: I0321 05:14:01.437036 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6a3d4de-9969-48f3-9f1a-9f273f81050a","Type":"ContainerStarted","Data":"eb5d2dfd0a855fd6493eb501435f98b0a22cdf9ba47e1f1f0c3cb123d53590ec"}
Mar 21 05:14:01 crc kubenswrapper[4580]: I0321 05:14:01.438976 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-5xjpq" event={"ID":"80b82bd4-51bb-4e57-9c87-779dc26bfbf1","Type":"ContainerStarted","Data":"00c99541059f023109dcbc1d89756f81275e8b14fa7f54467c341c7304774c72"}
Mar 21 05:14:02 crc kubenswrapper[4580]: I0321 05:14:02.332044 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 21 05:14:03 crc kubenswrapper[4580]: I0321 05:14:03.472368 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-5xjpq" event={"ID":"80b82bd4-51bb-4e57-9c87-779dc26bfbf1","Type":"ContainerStarted","Data":"d08bb178893463b29248137ba30bca92d626c1f48e945eba88f9d5edd1d723b9"}
Mar 21 05:14:03 crc kubenswrapper[4580]: I0321 05:14:03.499076 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.499055042 podStartE2EDuration="7.499055042s" podCreationTimestamp="2026-03-21 05:13:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:14:01.467413082 +0000 UTC m=+1346.549996730" watchObservedRunningTime="2026-03-21 05:14:03.499055042 +0000 UTC m=+1348.581638670"
Mar 21 05:14:03 crc kubenswrapper[4580]: I0321 05:14:03.499361 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567834-5xjpq" podStartSLOduration=2.373286468 podStartE2EDuration="3.49935515s" podCreationTimestamp="2026-03-21 05:14:00 +0000 UTC" firstStartedPulling="2026-03-21 05:14:01.284237821 +0000 UTC m=+1346.366821449" lastFinishedPulling="2026-03-21 05:14:02.410306493 +0000 UTC m=+1347.492890131" observedRunningTime="2026-03-21 05:14:03.494274165 +0000 UTC m=+1348.576857803" watchObservedRunningTime="2026-03-21 05:14:03.49935515 +0000 UTC m=+1348.581938778"
Mar 21 05:14:03 crc kubenswrapper[4580]: I0321 05:14:03.574529 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:59142->10.217.0.162:9311: read: connection reset by peer"
Mar 21 05:14:03 crc kubenswrapper[4580]: I0321 05:14:03.574579 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:59134->10.217.0.162:9311: read: connection reset by peer"
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.091947 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb"
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.162137 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-config-data\") pod \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") "
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.162210 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-combined-ca-bundle\") pod \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") "
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.162429 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-config-data-custom\") pod \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") "
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.162478 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpfks\" (UniqueName: \"kubernetes.io/projected/3ff1f3d4-eada-4182-9a04-48e23f84d11d-kube-api-access-qpfks\") pod \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") "
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.162510 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ff1f3d4-eada-4182-9a04-48e23f84d11d-logs\") pod \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\" (UID: \"3ff1f3d4-eada-4182-9a04-48e23f84d11d\") "
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.164574 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ff1f3d4-eada-4182-9a04-48e23f84d11d-logs" (OuterVolumeSpecName: "logs") pod "3ff1f3d4-eada-4182-9a04-48e23f84d11d" (UID: "3ff1f3d4-eada-4182-9a04-48e23f84d11d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.204153 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff1f3d4-eada-4182-9a04-48e23f84d11d-kube-api-access-qpfks" (OuterVolumeSpecName: "kube-api-access-qpfks") pod "3ff1f3d4-eada-4182-9a04-48e23f84d11d" (UID: "3ff1f3d4-eada-4182-9a04-48e23f84d11d"). InnerVolumeSpecName "kube-api-access-qpfks". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.207099 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3ff1f3d4-eada-4182-9a04-48e23f84d11d" (UID: "3ff1f3d4-eada-4182-9a04-48e23f84d11d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.247064 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-config-data" (OuterVolumeSpecName: "config-data") pod "3ff1f3d4-eada-4182-9a04-48e23f84d11d" (UID: "3ff1f3d4-eada-4182-9a04-48e23f84d11d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.264554 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ff1f3d4-eada-4182-9a04-48e23f84d11d" (UID: "3ff1f3d4-eada-4182-9a04-48e23f84d11d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.270054 4580 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.270087 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpfks\" (UniqueName: \"kubernetes.io/projected/3ff1f3d4-eada-4182-9a04-48e23f84d11d-kube-api-access-qpfks\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.270118 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ff1f3d4-eada-4182-9a04-48e23f84d11d-logs\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.270129 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.270138 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff1f3d4-eada-4182-9a04-48e23f84d11d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.492901 4580 generic.go:334] "Generic (PLEG): container finished" podID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerID="e0c8d817168a9754c07cec9b3d634ccc2dbd8f5fa7d189eee3ad126d5214d84b" exitCode=0
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.493955 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb"
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.494150 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" event={"ID":"3ff1f3d4-eada-4182-9a04-48e23f84d11d","Type":"ContainerDied","Data":"e0c8d817168a9754c07cec9b3d634ccc2dbd8f5fa7d189eee3ad126d5214d84b"}
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.494192 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbbf4f7bb-rvhzb" event={"ID":"3ff1f3d4-eada-4182-9a04-48e23f84d11d","Type":"ContainerDied","Data":"73aba58797f134b331641284137d4d69b627c6ddb3a62152104aca66c651ef6b"}
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.494216 4580 scope.go:117] "RemoveContainer" containerID="e0c8d817168a9754c07cec9b3d634ccc2dbd8f5fa7d189eee3ad126d5214d84b"
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.533964 4580 scope.go:117] "RemoveContainer" containerID="01a88e01d040064cfcb12f3cab4edb81eb398cca95d442bf18ec54af79c671d3"
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.540852 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5bbbf4f7bb-rvhzb"]
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.556534 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5bbbf4f7bb-rvhzb"]
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.580868 4580 scope.go:117] "RemoveContainer" containerID="e0c8d817168a9754c07cec9b3d634ccc2dbd8f5fa7d189eee3ad126d5214d84b"
Mar 21 05:14:04 crc kubenswrapper[4580]: E0321 05:14:04.581289 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0c8d817168a9754c07cec9b3d634ccc2dbd8f5fa7d189eee3ad126d5214d84b\": container with ID starting with e0c8d817168a9754c07cec9b3d634ccc2dbd8f5fa7d189eee3ad126d5214d84b not found: ID does not exist" containerID="e0c8d817168a9754c07cec9b3d634ccc2dbd8f5fa7d189eee3ad126d5214d84b"
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.581338 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0c8d817168a9754c07cec9b3d634ccc2dbd8f5fa7d189eee3ad126d5214d84b"} err="failed to get container status \"e0c8d817168a9754c07cec9b3d634ccc2dbd8f5fa7d189eee3ad126d5214d84b\": rpc error: code = NotFound desc = could not find container \"e0c8d817168a9754c07cec9b3d634ccc2dbd8f5fa7d189eee3ad126d5214d84b\": container with ID starting with e0c8d817168a9754c07cec9b3d634ccc2dbd8f5fa7d189eee3ad126d5214d84b not found: ID does not exist"
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.581381 4580 scope.go:117] "RemoveContainer" containerID="01a88e01d040064cfcb12f3cab4edb81eb398cca95d442bf18ec54af79c671d3"
Mar 21 05:14:04 crc kubenswrapper[4580]: E0321 05:14:04.581628 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01a88e01d040064cfcb12f3cab4edb81eb398cca95d442bf18ec54af79c671d3\": container with ID starting with 01a88e01d040064cfcb12f3cab4edb81eb398cca95d442bf18ec54af79c671d3 not found: ID does not exist" containerID="01a88e01d040064cfcb12f3cab4edb81eb398cca95d442bf18ec54af79c671d3"
Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.581650 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01a88e01d040064cfcb12f3cab4edb81eb398cca95d442bf18ec54af79c671d3"} err="failed to get container status \"01a88e01d040064cfcb12f3cab4edb81eb398cca95d442bf18ec54af79c671d3\": rpc error: code = NotFound desc = could not find container \"01a88e01d040064cfcb12f3cab4edb81eb398cca95d442bf18ec54af79c671d3\": container
with ID starting with 01a88e01d040064cfcb12f3cab4edb81eb398cca95d442bf18ec54af79c671d3 not found: ID does not exist" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.641302 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 21 05:14:04 crc kubenswrapper[4580]: E0321 05:14:04.641955 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api-log" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.642070 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api-log" Mar 21 05:14:04 crc kubenswrapper[4580]: E0321 05:14:04.642197 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.642252 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.642521 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api-log" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.642594 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" containerName="barbican-api" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.643294 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.649664 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-k56vc" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.649948 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.650655 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.662537 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.788271 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/286ff68a-a9d7-4592-9146-f9537c8cf329-openstack-config-secret\") pod \"openstackclient\" (UID: \"286ff68a-a9d7-4592-9146-f9537c8cf329\") " pod="openstack/openstackclient" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.788375 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/286ff68a-a9d7-4592-9146-f9537c8cf329-combined-ca-bundle\") pod \"openstackclient\" (UID: \"286ff68a-a9d7-4592-9146-f9537c8cf329\") " pod="openstack/openstackclient" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.788414 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/286ff68a-a9d7-4592-9146-f9537c8cf329-openstack-config\") pod \"openstackclient\" (UID: \"286ff68a-a9d7-4592-9146-f9537c8cf329\") " pod="openstack/openstackclient" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.788433 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkrw8\" (UniqueName: \"kubernetes.io/projected/286ff68a-a9d7-4592-9146-f9537c8cf329-kube-api-access-gkrw8\") pod \"openstackclient\" (UID: \"286ff68a-a9d7-4592-9146-f9537c8cf329\") " pod="openstack/openstackclient" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.889694 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/286ff68a-a9d7-4592-9146-f9537c8cf329-combined-ca-bundle\") pod \"openstackclient\" (UID: \"286ff68a-a9d7-4592-9146-f9537c8cf329\") " pod="openstack/openstackclient" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.889757 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/286ff68a-a9d7-4592-9146-f9537c8cf329-openstack-config\") pod \"openstackclient\" (UID: \"286ff68a-a9d7-4592-9146-f9537c8cf329\") " pod="openstack/openstackclient" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.889823 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkrw8\" (UniqueName: \"kubernetes.io/projected/286ff68a-a9d7-4592-9146-f9537c8cf329-kube-api-access-gkrw8\") pod \"openstackclient\" (UID: \"286ff68a-a9d7-4592-9146-f9537c8cf329\") " pod="openstack/openstackclient" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.889897 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/286ff68a-a9d7-4592-9146-f9537c8cf329-openstack-config-secret\") pod \"openstackclient\" (UID: \"286ff68a-a9d7-4592-9146-f9537c8cf329\") " pod="openstack/openstackclient" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.890621 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/286ff68a-a9d7-4592-9146-f9537c8cf329-openstack-config\") pod \"openstackclient\" (UID: \"286ff68a-a9d7-4592-9146-f9537c8cf329\") " pod="openstack/openstackclient" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.895100 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/286ff68a-a9d7-4592-9146-f9537c8cf329-openstack-config-secret\") pod \"openstackclient\" (UID: \"286ff68a-a9d7-4592-9146-f9537c8cf329\") " pod="openstack/openstackclient" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.908387 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/286ff68a-a9d7-4592-9146-f9537c8cf329-combined-ca-bundle\") pod \"openstackclient\" (UID: \"286ff68a-a9d7-4592-9146-f9537c8cf329\") " pod="openstack/openstackclient" Mar 21 05:14:04 crc kubenswrapper[4580]: I0321 05:14:04.924654 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkrw8\" (UniqueName: \"kubernetes.io/projected/286ff68a-a9d7-4592-9146-f9537c8cf329-kube-api-access-gkrw8\") pod \"openstackclient\" (UID: \"286ff68a-a9d7-4592-9146-f9537c8cf329\") " pod="openstack/openstackclient" Mar 21 05:14:05 crc kubenswrapper[4580]: I0321 05:14:05.017300 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 21 05:14:05 crc kubenswrapper[4580]: W0321 05:14:05.576340 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod286ff68a_a9d7_4592_9146_f9537c8cf329.slice/crio-79c0d5336818e88ca5a6754743643728c963ad13b58ab9835042d46046b7a8ba WatchSource:0}: Error finding container 79c0d5336818e88ca5a6754743643728c963ad13b58ab9835042d46046b7a8ba: Status 404 returned error can't find the container with id 79c0d5336818e88ca5a6754743643728c963ad13b58ab9835042d46046b7a8ba Mar 21 05:14:05 crc kubenswrapper[4580]: I0321 05:14:05.577302 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 21 05:14:05 crc kubenswrapper[4580]: I0321 05:14:05.631560 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ff1f3d4-eada-4182-9a04-48e23f84d11d" path="/var/lib/kubelet/pods/3ff1f3d4-eada-4182-9a04-48e23f84d11d/volumes" Mar 21 05:14:06 crc kubenswrapper[4580]: I0321 05:14:06.526206 4580 generic.go:334] "Generic (PLEG): container finished" podID="80b82bd4-51bb-4e57-9c87-779dc26bfbf1" containerID="d08bb178893463b29248137ba30bca92d626c1f48e945eba88f9d5edd1d723b9" exitCode=0 Mar 21 05:14:06 crc kubenswrapper[4580]: I0321 05:14:06.526716 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-5xjpq" event={"ID":"80b82bd4-51bb-4e57-9c87-779dc26bfbf1","Type":"ContainerDied","Data":"d08bb178893463b29248137ba30bca92d626c1f48e945eba88f9d5edd1d723b9"} Mar 21 05:14:06 crc kubenswrapper[4580]: I0321 05:14:06.533123 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"286ff68a-a9d7-4592-9146-f9537c8cf329","Type":"ContainerStarted","Data":"79c0d5336818e88ca5a6754743643728c963ad13b58ab9835042d46046b7a8ba"} Mar 21 05:14:06 crc kubenswrapper[4580]: I0321 05:14:06.654174 4580 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:14:07 crc kubenswrapper[4580]: I0321 05:14:07.753163 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 21 05:14:08 crc kubenswrapper[4580]: I0321 05:14:08.142730 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-5xjpq" Mar 21 05:14:08 crc kubenswrapper[4580]: I0321 05:14:08.261428 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62gvb\" (UniqueName: \"kubernetes.io/projected/80b82bd4-51bb-4e57-9c87-779dc26bfbf1-kube-api-access-62gvb\") pod \"80b82bd4-51bb-4e57-9c87-779dc26bfbf1\" (UID: \"80b82bd4-51bb-4e57-9c87-779dc26bfbf1\") " Mar 21 05:14:08 crc kubenswrapper[4580]: I0321 05:14:08.277482 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80b82bd4-51bb-4e57-9c87-779dc26bfbf1-kube-api-access-62gvb" (OuterVolumeSpecName: "kube-api-access-62gvb") pod "80b82bd4-51bb-4e57-9c87-779dc26bfbf1" (UID: "80b82bd4-51bb-4e57-9c87-779dc26bfbf1"). InnerVolumeSpecName "kube-api-access-62gvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:08 crc kubenswrapper[4580]: I0321 05:14:08.364139 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62gvb\" (UniqueName: \"kubernetes.io/projected/80b82bd4-51bb-4e57-9c87-779dc26bfbf1-kube-api-access-62gvb\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:08 crc kubenswrapper[4580]: I0321 05:14:08.559858 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-5xjpq" event={"ID":"80b82bd4-51bb-4e57-9c87-779dc26bfbf1","Type":"ContainerDied","Data":"00c99541059f023109dcbc1d89756f81275e8b14fa7f54467c341c7304774c72"} Mar 21 05:14:08 crc kubenswrapper[4580]: I0321 05:14:08.559910 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00c99541059f023109dcbc1d89756f81275e8b14fa7f54467c341c7304774c72" Mar 21 05:14:08 crc kubenswrapper[4580]: I0321 05:14:08.559962 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-5xjpq" Mar 21 05:14:08 crc kubenswrapper[4580]: I0321 05:14:08.670389 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-rqw6s"] Mar 21 05:14:08 crc kubenswrapper[4580]: I0321 05:14:08.683116 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-rqw6s"] Mar 21 05:14:09 crc kubenswrapper[4580]: I0321 05:14:09.632523 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8604f80a-d308-4e06-8d72-3281dfdc4a6a" path="/var/lib/kubelet/pods/8604f80a-d308-4e06-8d72-3281dfdc4a6a/volumes" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.737166 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6dbb667f95-c5g4x"] Mar 21 05:14:10 crc kubenswrapper[4580]: E0321 05:14:10.737644 4580 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="80b82bd4-51bb-4e57-9c87-779dc26bfbf1" containerName="oc" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.737660 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b82bd4-51bb-4e57-9c87-779dc26bfbf1" containerName="oc" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.737889 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="80b82bd4-51bb-4e57-9c87-779dc26bfbf1" containerName="oc" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.738969 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.743035 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.743226 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.743223 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6dbb667f95-c5g4x"] Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.743956 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.832421 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21065819-f94d-4cc9-925f-c4be4eeee0d7-log-httpd\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.832813 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21065819-f94d-4cc9-925f-c4be4eeee0d7-internal-tls-certs\") pod \"swift-proxy-6dbb667f95-c5g4x\" 
(UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.832883 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21065819-f94d-4cc9-925f-c4be4eeee0d7-etc-swift\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.832936 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21065819-f94d-4cc9-925f-c4be4eeee0d7-combined-ca-bundle\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.832960 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkcpg\" (UniqueName: \"kubernetes.io/projected/21065819-f94d-4cc9-925f-c4be4eeee0d7-kube-api-access-jkcpg\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.833145 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21065819-f94d-4cc9-925f-c4be4eeee0d7-config-data\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.833245 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/21065819-f94d-4cc9-925f-c4be4eeee0d7-public-tls-certs\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.833410 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21065819-f94d-4cc9-925f-c4be4eeee0d7-run-httpd\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.935280 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21065819-f94d-4cc9-925f-c4be4eeee0d7-combined-ca-bundle\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.935327 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkcpg\" (UniqueName: \"kubernetes.io/projected/21065819-f94d-4cc9-925f-c4be4eeee0d7-kube-api-access-jkcpg\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.935363 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21065819-f94d-4cc9-925f-c4be4eeee0d7-config-data\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.935387 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/21065819-f94d-4cc9-925f-c4be4eeee0d7-public-tls-certs\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.935425 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21065819-f94d-4cc9-925f-c4be4eeee0d7-run-httpd\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.935464 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21065819-f94d-4cc9-925f-c4be4eeee0d7-log-httpd\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.935495 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21065819-f94d-4cc9-925f-c4be4eeee0d7-internal-tls-certs\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.935662 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21065819-f94d-4cc9-925f-c4be4eeee0d7-etc-swift\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.936162 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21065819-f94d-4cc9-925f-c4be4eeee0d7-log-httpd\") pod 
\"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.936491 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21065819-f94d-4cc9-925f-c4be4eeee0d7-run-httpd\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.947008 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21065819-f94d-4cc9-925f-c4be4eeee0d7-public-tls-certs\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.947762 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21065819-f94d-4cc9-925f-c4be4eeee0d7-combined-ca-bundle\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.956039 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21065819-f94d-4cc9-925f-c4be4eeee0d7-internal-tls-certs\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.957335 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21065819-f94d-4cc9-925f-c4be4eeee0d7-etc-swift\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " 
pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.960324 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkcpg\" (UniqueName: \"kubernetes.io/projected/21065819-f94d-4cc9-925f-c4be4eeee0d7-kube-api-access-jkcpg\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:10 crc kubenswrapper[4580]: I0321 05:14:10.971223 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21065819-f94d-4cc9-925f-c4be4eeee0d7-config-data\") pod \"swift-proxy-6dbb667f95-c5g4x\" (UID: \"21065819-f94d-4cc9-925f-c4be4eeee0d7\") " pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:11 crc kubenswrapper[4580]: I0321 05:14:11.081987 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:12 crc kubenswrapper[4580]: I0321 05:14:12.307367 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6cd755485-pmnqc" Mar 21 05:14:12 crc kubenswrapper[4580]: I0321 05:14:12.390103 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57d468d6b8-4xwvd"] Mar 21 05:14:12 crc kubenswrapper[4580]: I0321 05:14:12.390329 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-57d468d6b8-4xwvd" podUID="33984e31-23ff-4d28-9828-74d12b7fc0a7" containerName="neutron-api" containerID="cri-o://d13a268270d73439428136cc30f637118603cadce1019c8dd25bc9f0ac7c91d1" gracePeriod=30 Mar 21 05:14:12 crc kubenswrapper[4580]: I0321 05:14:12.390564 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-57d468d6b8-4xwvd" podUID="33984e31-23ff-4d28-9828-74d12b7fc0a7" containerName="neutron-httpd" 
containerID="cri-o://d79d9b98b19c66dc420061e3d3ea414cafbef44bb76a885281ddb24f30144e6c" gracePeriod=30 Mar 21 05:14:12 crc kubenswrapper[4580]: I0321 05:14:12.567232 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6dbb667f95-c5g4x"] Mar 21 05:14:12 crc kubenswrapper[4580]: I0321 05:14:12.621132 4580 generic.go:334] "Generic (PLEG): container finished" podID="33984e31-23ff-4d28-9828-74d12b7fc0a7" containerID="d79d9b98b19c66dc420061e3d3ea414cafbef44bb76a885281ddb24f30144e6c" exitCode=0 Mar 21 05:14:12 crc kubenswrapper[4580]: I0321 05:14:12.621246 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d468d6b8-4xwvd" event={"ID":"33984e31-23ff-4d28-9828-74d12b7fc0a7","Type":"ContainerDied","Data":"d79d9b98b19c66dc420061e3d3ea414cafbef44bb76a885281ddb24f30144e6c"} Mar 21 05:14:12 crc kubenswrapper[4580]: I0321 05:14:12.637236 4580 generic.go:334] "Generic (PLEG): container finished" podID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerID="b03c7b6b3d34260bff0a00bc798a52da6836ea0ee76b7c6df6980b8c29af49eb" exitCode=137 Mar 21 05:14:12 crc kubenswrapper[4580]: I0321 05:14:12.637289 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67655f8b6-mbx6n" event={"ID":"a03ce0fa-f7e8-4b48-bbea-95807f14dd26","Type":"ContainerDied","Data":"b03c7b6b3d34260bff0a00bc798a52da6836ea0ee76b7c6df6980b8c29af49eb"} Mar 21 05:14:13 crc kubenswrapper[4580]: I0321 05:14:13.436842 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 21 05:14:13 crc kubenswrapper[4580]: I0321 05:14:13.659183 4580 generic.go:334] "Generic (PLEG): container finished" podID="08a0110f-428a-481d-b439-bc16e6837dc3" containerID="19fb47284615f4db4d3ee3b8a1bb2963d50724cdbe63d92f0b19442506b6bf5b" exitCode=137 Mar 21 05:14:13 crc kubenswrapper[4580]: I0321 05:14:13.659225 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587cfc8688-265kc" 
event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerDied","Data":"19fb47284615f4db4d3ee3b8a1bb2963d50724cdbe63d92f0b19442506b6bf5b"} Mar 21 05:14:13 crc kubenswrapper[4580]: I0321 05:14:13.720512 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:14:13 crc kubenswrapper[4580]: I0321 05:14:13.720963 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="sg-core" containerID="cri-o://db67d4321b5cd6dafcdc1ed5a2a09ad8d3610f33d42492f866565584c063d3d2" gracePeriod=30 Mar 21 05:14:13 crc kubenswrapper[4580]: I0321 05:14:13.720971 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="proxy-httpd" containerID="cri-o://b83fd3821ac6334371330cc325afa955d8d19b6906ad78933d2d62d6b4ec28eb" gracePeriod=30 Mar 21 05:14:13 crc kubenswrapper[4580]: I0321 05:14:13.721004 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="ceilometer-notification-agent" containerID="cri-o://1e32fbc06eb17d4fbb9d9fd6f9781c6231cf73fa6cc7275bb60b3fbd2ba6c4bf" gracePeriod=30 Mar 21 05:14:13 crc kubenswrapper[4580]: I0321 05:14:13.720924 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="ceilometer-central-agent" containerID="cri-o://13cdf4bc25a2a4ff4f44203ac90f53c94b3c3f7df282de0ab9155d6b47bc2473" gracePeriod=30 Mar 21 05:14:14 crc kubenswrapper[4580]: I0321 05:14:14.672726 4580 generic.go:334] "Generic (PLEG): container finished" podID="0694bb24-df06-41c3-a24d-8428090b6df4" containerID="b83fd3821ac6334371330cc325afa955d8d19b6906ad78933d2d62d6b4ec28eb" exitCode=0 Mar 21 05:14:14 crc kubenswrapper[4580]: I0321 
05:14:14.672757 4580 generic.go:334] "Generic (PLEG): container finished" podID="0694bb24-df06-41c3-a24d-8428090b6df4" containerID="db67d4321b5cd6dafcdc1ed5a2a09ad8d3610f33d42492f866565584c063d3d2" exitCode=2
Mar 21 05:14:14 crc kubenswrapper[4580]: I0321 05:14:14.672766 4580 generic.go:334] "Generic (PLEG): container finished" podID="0694bb24-df06-41c3-a24d-8428090b6df4" containerID="13cdf4bc25a2a4ff4f44203ac90f53c94b3c3f7df282de0ab9155d6b47bc2473" exitCode=0
Mar 21 05:14:14 crc kubenswrapper[4580]: I0321 05:14:14.672801 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0694bb24-df06-41c3-a24d-8428090b6df4","Type":"ContainerDied","Data":"b83fd3821ac6334371330cc325afa955d8d19b6906ad78933d2d62d6b4ec28eb"}
Mar 21 05:14:14 crc kubenswrapper[4580]: I0321 05:14:14.672827 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0694bb24-df06-41c3-a24d-8428090b6df4","Type":"ContainerDied","Data":"db67d4321b5cd6dafcdc1ed5a2a09ad8d3610f33d42492f866565584c063d3d2"}
Mar 21 05:14:14 crc kubenswrapper[4580]: I0321 05:14:14.672837 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0694bb24-df06-41c3-a24d-8428090b6df4","Type":"ContainerDied","Data":"13cdf4bc25a2a4ff4f44203ac90f53c94b3c3f7df282de0ab9155d6b47bc2473"}
Mar 21 05:14:15 crc kubenswrapper[4580]: I0321 05:14:15.685195 4580 generic.go:334] "Generic (PLEG): container finished" podID="d8267ff8-754f-446f-a7b1-94462f0f9c93" containerID="c48256f4ad3c92fc98688db4c7fc3dbc3507966ff1f6f93b840cfa6e24de8913" exitCode=137
Mar 21 05:14:15 crc kubenswrapper[4580]: I0321 05:14:15.685259 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8267ff8-754f-446f-a7b1-94462f0f9c93","Type":"ContainerDied","Data":"c48256f4ad3c92fc98688db4c7fc3dbc3507966ff1f6f93b840cfa6e24de8913"}
Mar 21 05:14:15 crc kubenswrapper[4580]: I0321 05:14:15.688828 4580 generic.go:334] "Generic (PLEG): container finished" podID="33984e31-23ff-4d28-9828-74d12b7fc0a7" containerID="d13a268270d73439428136cc30f637118603cadce1019c8dd25bc9f0ac7c91d1" exitCode=0
Mar 21 05:14:15 crc kubenswrapper[4580]: I0321 05:14:15.688860 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d468d6b8-4xwvd" event={"ID":"33984e31-23ff-4d28-9828-74d12b7fc0a7","Type":"ContainerDied","Data":"d13a268270d73439428136cc30f637118603cadce1019c8dd25bc9f0ac7c91d1"}
Mar 21 05:14:15 crc kubenswrapper[4580]: I0321 05:14:15.947665 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:14:15 crc kubenswrapper[4580]: I0321 05:14:15.947735 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:14:16 crc kubenswrapper[4580]: I0321 05:14:16.465606 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="d8267ff8-754f-446f-a7b1-94462f0f9c93" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.164:8776/healthcheck\": dial tcp 10.217.0.164:8776: connect: connection refused"
Mar 21 05:14:17 crc kubenswrapper[4580]: I0321 05:14:17.728775 4580 generic.go:334] "Generic (PLEG): container finished" podID="0694bb24-df06-41c3-a24d-8428090b6df4" containerID="1e32fbc06eb17d4fbb9d9fd6f9781c6231cf73fa6cc7275bb60b3fbd2ba6c4bf" exitCode=0
Mar 21 05:14:17 crc kubenswrapper[4580]: I0321 05:14:17.729126 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0694bb24-df06-41c3-a24d-8428090b6df4","Type":"ContainerDied","Data":"1e32fbc06eb17d4fbb9d9fd6f9781c6231cf73fa6cc7275bb60b3fbd2ba6c4bf"}
Mar 21 05:14:18 crc kubenswrapper[4580]: I0321 05:14:18.638510 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 21 05:14:18 crc kubenswrapper[4580]: I0321 05:14:18.639680 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="caa7b0b4-ac59-4338-896e-723db48b3d24" containerName="kube-state-metrics" containerID="cri-o://6022b48e0c0eb828b77506fcdd071acca3447e151624bfca742cdcf31c26cd9b" gracePeriod=30
Mar 21 05:14:19 crc kubenswrapper[4580]: I0321 05:14:19.768372 4580 generic.go:334] "Generic (PLEG): container finished" podID="caa7b0b4-ac59-4338-896e-723db48b3d24" containerID="6022b48e0c0eb828b77506fcdd071acca3447e151624bfca742cdcf31c26cd9b" exitCode=2
Mar 21 05:14:19 crc kubenswrapper[4580]: I0321 05:14:19.768417 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"caa7b0b4-ac59-4338-896e-723db48b3d24","Type":"ContainerDied","Data":"6022b48e0c0eb828b77506fcdd071acca3447e151624bfca742cdcf31c26cd9b"}
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.479285 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.772208 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8267ff8-754f-446f-a7b1-94462f0f9c93-logs\") pod \"d8267ff8-754f-446f-a7b1-94462f0f9c93\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") "
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.772289 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrvxl\" (UniqueName: \"kubernetes.io/projected/d8267ff8-754f-446f-a7b1-94462f0f9c93-kube-api-access-xrvxl\") pod \"d8267ff8-754f-446f-a7b1-94462f0f9c93\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") "
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.772387 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8267ff8-754f-446f-a7b1-94462f0f9c93-etc-machine-id\") pod \"d8267ff8-754f-446f-a7b1-94462f0f9c93\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") "
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.772496 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-combined-ca-bundle\") pod \"d8267ff8-754f-446f-a7b1-94462f0f9c93\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") "
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.772551 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-config-data-custom\") pod \"d8267ff8-754f-446f-a7b1-94462f0f9c93\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") "
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.772622 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-config-data\") pod \"d8267ff8-754f-446f-a7b1-94462f0f9c93\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") "
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.772640 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-scripts\") pod \"d8267ff8-754f-446f-a7b1-94462f0f9c93\" (UID: \"d8267ff8-754f-446f-a7b1-94462f0f9c93\") "
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.774382 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8267ff8-754f-446f-a7b1-94462f0f9c93-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d8267ff8-754f-446f-a7b1-94462f0f9c93" (UID: "d8267ff8-754f-446f-a7b1-94462f0f9c93"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.779044 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-scripts" (OuterVolumeSpecName: "scripts") pod "d8267ff8-754f-446f-a7b1-94462f0f9c93" (UID: "d8267ff8-754f-446f-a7b1-94462f0f9c93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.780947 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d8267ff8-754f-446f-a7b1-94462f0f9c93" (UID: "d8267ff8-754f-446f-a7b1-94462f0f9c93"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.784520 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8267ff8-754f-446f-a7b1-94462f0f9c93-logs" (OuterVolumeSpecName: "logs") pod "d8267ff8-754f-446f-a7b1-94462f0f9c93" (UID: "d8267ff8-754f-446f-a7b1-94462f0f9c93"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.816292 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8267ff8-754f-446f-a7b1-94462f0f9c93-kube-api-access-xrvxl" (OuterVolumeSpecName: "kube-api-access-xrvxl") pod "d8267ff8-754f-446f-a7b1-94462f0f9c93" (UID: "d8267ff8-754f-446f-a7b1-94462f0f9c93"). InnerVolumeSpecName "kube-api-access-xrvxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.877972 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8267ff8-754f-446f-a7b1-94462f0f9c93" (UID: "d8267ff8-754f-446f-a7b1-94462f0f9c93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.892214 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8267ff8-754f-446f-a7b1-94462f0f9c93-logs\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.892253 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrvxl\" (UniqueName: \"kubernetes.io/projected/d8267ff8-754f-446f-a7b1-94462f0f9c93-kube-api-access-xrvxl\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.892266 4580 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8267ff8-754f-446f-a7b1-94462f0f9c93-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.892276 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.892284 4580 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.892293 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.904176 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"caa7b0b4-ac59-4338-896e-723db48b3d24","Type":"ContainerDied","Data":"15c1b6ce88564a1603e9486fd67ed1d7e575fcf7ea2241910d51a6113a686dee"}
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.904219 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15c1b6ce88564a1603e9486fd67ed1d7e575fcf7ea2241910d51a6113a686dee"
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.919062 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.930650 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587cfc8688-265kc" event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerStarted","Data":"f8ab4ef90bd31d20c6033eb943d9a0a9a88a0d10339df1ff4a08e1b1232fe783"}
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.972236 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8267ff8-754f-446f-a7b1-94462f0f9c93","Type":"ContainerDied","Data":"e738bd822cecca4278152455525905e466acaf73a6a97414c28eca960b5f3615"}
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.972316 4580 scope.go:117] "RemoveContainer" containerID="c48256f4ad3c92fc98688db4c7fc3dbc3507966ff1f6f93b840cfa6e24de8913"
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.972538 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.993574 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7fn9\" (UniqueName: \"kubernetes.io/projected/caa7b0b4-ac59-4338-896e-723db48b3d24-kube-api-access-t7fn9\") pod \"caa7b0b4-ac59-4338-896e-723db48b3d24\" (UID: \"caa7b0b4-ac59-4338-896e-723db48b3d24\") "
Mar 21 05:14:20 crc kubenswrapper[4580]: I0321 05:14:20.996059 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67655f8b6-mbx6n" event={"ID":"a03ce0fa-f7e8-4b48-bbea-95807f14dd26","Type":"ContainerStarted","Data":"b1910d7dc39d75c560d1ecb55908d0c4f510cbbee17323265da8706ab45dadba"}
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.002963 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa7b0b4-ac59-4338-896e-723db48b3d24-kube-api-access-t7fn9" (OuterVolumeSpecName: "kube-api-access-t7fn9") pod "caa7b0b4-ac59-4338-896e-723db48b3d24" (UID: "caa7b0b4-ac59-4338-896e-723db48b3d24"). InnerVolumeSpecName "kube-api-access-t7fn9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.030576 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-config-data" (OuterVolumeSpecName: "config-data") pod "d8267ff8-754f-446f-a7b1-94462f0f9c93" (UID: "d8267ff8-754f-446f-a7b1-94462f0f9c93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.056832 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"286ff68a-a9d7-4592-9146-f9537c8cf329","Type":"ContainerStarted","Data":"63eb1d5926530efb6895d6f5ef9f6af894e70526e2a0e4944a66842820801616"}
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.075965 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6dbb667f95-c5g4x" event={"ID":"21065819-f94d-4cc9-925f-c4be4eeee0d7","Type":"ContainerStarted","Data":"3af643a11e4ebc6d0a47275b86dd7d64c0fb385aee2cbfec04503aeb700d4eda"}
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.076013 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6dbb667f95-c5g4x" event={"ID":"21065819-f94d-4cc9-925f-c4be4eeee0d7","Type":"ContainerStarted","Data":"554c686165239e4271fd2f9d0666a48d25968d4b11562e9e53b8293edb00cb24"}
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.101269 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8267ff8-754f-446f-a7b1-94462f0f9c93-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.101321 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7fn9\" (UniqueName: \"kubernetes.io/projected/caa7b0b4-ac59-4338-896e-723db48b3d24-kube-api-access-t7fn9\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.147124 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.4893765070000002 podStartE2EDuration="17.147097353s" podCreationTimestamp="2026-03-21 05:14:04 +0000 UTC" firstStartedPulling="2026-03-21 05:14:05.579744207 +0000 UTC m=+1350.662327835" lastFinishedPulling="2026-03-21 05:14:20.237465053 +0000 UTC m=+1365.320048681" observedRunningTime="2026-03-21 05:14:21.140654341 +0000 UTC m=+1366.223237989" watchObservedRunningTime="2026-03-21 05:14:21.147097353 +0000 UTC m=+1366.229680991"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.177212 4580 scope.go:117] "RemoveContainer" containerID="9813aa78da4b6d40fe72511148ffa7827baa3bca2fdcafe18f4d6af83fe0fbb2"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.180807 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.183201 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57d468d6b8-4xwvd"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.305904 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-httpd-config\") pod \"33984e31-23ff-4d28-9828-74d12b7fc0a7\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") "
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.306079 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-scripts\") pod \"0694bb24-df06-41c3-a24d-8428090b6df4\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") "
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.306275 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-combined-ca-bundle\") pod \"33984e31-23ff-4d28-9828-74d12b7fc0a7\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") "
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.306315 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-config-data\") pod \"0694bb24-df06-41c3-a24d-8428090b6df4\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") "
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.306409 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-ovndb-tls-certs\") pod \"33984e31-23ff-4d28-9828-74d12b7fc0a7\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") "
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.306436 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0694bb24-df06-41c3-a24d-8428090b6df4-log-httpd\") pod \"0694bb24-df06-41c3-a24d-8428090b6df4\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") "
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.306496 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-sg-core-conf-yaml\") pod \"0694bb24-df06-41c3-a24d-8428090b6df4\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") "
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.306529 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-combined-ca-bundle\") pod \"0694bb24-df06-41c3-a24d-8428090b6df4\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") "
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.306583 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-config\") pod \"33984e31-23ff-4d28-9828-74d12b7fc0a7\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") "
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.306660 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm6vf\" (UniqueName: \"kubernetes.io/projected/0694bb24-df06-41c3-a24d-8428090b6df4-kube-api-access-zm6vf\") pod \"0694bb24-df06-41c3-a24d-8428090b6df4\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") "
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.306708 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9qgj\" (UniqueName: \"kubernetes.io/projected/33984e31-23ff-4d28-9828-74d12b7fc0a7-kube-api-access-g9qgj\") pod \"33984e31-23ff-4d28-9828-74d12b7fc0a7\" (UID: \"33984e31-23ff-4d28-9828-74d12b7fc0a7\") "
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.306730 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0694bb24-df06-41c3-a24d-8428090b6df4-run-httpd\") pod \"0694bb24-df06-41c3-a24d-8428090b6df4\" (UID: \"0694bb24-df06-41c3-a24d-8428090b6df4\") "
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.307709 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0694bb24-df06-41c3-a24d-8428090b6df4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0694bb24-df06-41c3-a24d-8428090b6df4" (UID: "0694bb24-df06-41c3-a24d-8428090b6df4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.337352 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0694bb24-df06-41c3-a24d-8428090b6df4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0694bb24-df06-41c3-a24d-8428090b6df4" (UID: "0694bb24-df06-41c3-a24d-8428090b6df4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.359231 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33984e31-23ff-4d28-9828-74d12b7fc0a7-kube-api-access-g9qgj" (OuterVolumeSpecName: "kube-api-access-g9qgj") pod "33984e31-23ff-4d28-9828-74d12b7fc0a7" (UID: "33984e31-23ff-4d28-9828-74d12b7fc0a7"). InnerVolumeSpecName "kube-api-access-g9qgj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.359798 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-scripts" (OuterVolumeSpecName: "scripts") pod "0694bb24-df06-41c3-a24d-8428090b6df4" (UID: "0694bb24-df06-41c3-a24d-8428090b6df4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.359907 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "33984e31-23ff-4d28-9828-74d12b7fc0a7" (UID: "33984e31-23ff-4d28-9828-74d12b7fc0a7"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.371614 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0694bb24-df06-41c3-a24d-8428090b6df4-kube-api-access-zm6vf" (OuterVolumeSpecName: "kube-api-access-zm6vf") pod "0694bb24-df06-41c3-a24d-8428090b6df4" (UID: "0694bb24-df06-41c3-a24d-8428090b6df4"). InnerVolumeSpecName "kube-api-access-zm6vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.394917 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-587cfc8688-265kc"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.394983 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-587cfc8688-265kc"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.409065 4580 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0694bb24-df06-41c3-a24d-8428090b6df4-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.409103 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm6vf\" (UniqueName: \"kubernetes.io/projected/0694bb24-df06-41c3-a24d-8428090b6df4-kube-api-access-zm6vf\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.409117 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9qgj\" (UniqueName: \"kubernetes.io/projected/33984e31-23ff-4d28-9828-74d12b7fc0a7-kube-api-access-g9qgj\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.409128 4580 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0694bb24-df06-41c3-a24d-8428090b6df4-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.409141 4580 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.409150 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.427985 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0694bb24-df06-41c3-a24d-8428090b6df4" (UID: "0694bb24-df06-41c3-a24d-8428090b6df4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.440880 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.472719 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.486074 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-config" (OuterVolumeSpecName: "config") pod "33984e31-23ff-4d28-9828-74d12b7fc0a7" (UID: "33984e31-23ff-4d28-9828-74d12b7fc0a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.501859 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 21 05:14:21 crc kubenswrapper[4580]: E0321 05:14:21.502373 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8267ff8-754f-446f-a7b1-94462f0f9c93" containerName="cinder-api-log"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502391 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8267ff8-754f-446f-a7b1-94462f0f9c93" containerName="cinder-api-log"
Mar 21 05:14:21 crc kubenswrapper[4580]: E0321 05:14:21.502409 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="ceilometer-central-agent"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502416 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="ceilometer-central-agent"
Mar 21 05:14:21 crc kubenswrapper[4580]: E0321 05:14:21.502441 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8267ff8-754f-446f-a7b1-94462f0f9c93" containerName="cinder-api"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502447 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8267ff8-754f-446f-a7b1-94462f0f9c93" containerName="cinder-api"
Mar 21 05:14:21 crc kubenswrapper[4580]: E0321 05:14:21.502460 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33984e31-23ff-4d28-9828-74d12b7fc0a7" containerName="neutron-api"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502466 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="33984e31-23ff-4d28-9828-74d12b7fc0a7" containerName="neutron-api"
Mar 21 05:14:21 crc kubenswrapper[4580]: E0321 05:14:21.502483 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33984e31-23ff-4d28-9828-74d12b7fc0a7" containerName="neutron-httpd"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502490 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="33984e31-23ff-4d28-9828-74d12b7fc0a7" containerName="neutron-httpd"
Mar 21 05:14:21 crc kubenswrapper[4580]: E0321 05:14:21.502517 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa7b0b4-ac59-4338-896e-723db48b3d24" containerName="kube-state-metrics"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502525 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa7b0b4-ac59-4338-896e-723db48b3d24" containerName="kube-state-metrics"
Mar 21 05:14:21 crc kubenswrapper[4580]: E0321 05:14:21.502538 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="ceilometer-notification-agent"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502544 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="ceilometer-notification-agent"
Mar 21 05:14:21 crc kubenswrapper[4580]: E0321 05:14:21.502554 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="sg-core"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502595 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="sg-core"
Mar 21 05:14:21 crc kubenswrapper[4580]: E0321 05:14:21.502606 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="proxy-httpd"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502612 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="proxy-httpd"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502882 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="ceilometer-central-agent"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502904 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="33984e31-23ff-4d28-9828-74d12b7fc0a7" containerName="neutron-httpd"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502915 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="33984e31-23ff-4d28-9828-74d12b7fc0a7" containerName="neutron-api"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502925 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="sg-core"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502932 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa7b0b4-ac59-4338-896e-723db48b3d24" containerName="kube-state-metrics"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502957 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="ceilometer-notification-agent"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502968 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" containerName="proxy-httpd"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502981 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8267ff8-754f-446f-a7b1-94462f0f9c93" containerName="cinder-api-log"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.502993 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8267ff8-754f-446f-a7b1-94462f0f9c93" containerName="cinder-api"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.504457 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.507090 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67655f8b6-mbx6n"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.507151 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67655f8b6-mbx6n"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.510873 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-config\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.510903 4580 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.512870 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.513185 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.514140 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.520292 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33984e31-23ff-4d28-9828-74d12b7fc0a7" (UID: "33984e31-23ff-4d28-9828-74d12b7fc0a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.532501 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.613372 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf9b9\" (UniqueName: \"kubernetes.io/projected/18848c19-7735-494d-babb-32e04c8ef382-kube-api-access-hf9b9\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.613431 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.613481 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.613509 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-config-data\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.613546 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-scripts\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.613566 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-config-data-custom\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.613598 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-public-tls-certs\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.613631 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18848c19-7735-494d-babb-32e04c8ef382-logs\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.613655 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18848c19-7735-494d-babb-32e04c8ef382-etc-machine-id\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.613700 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.633096 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8267ff8-754f-446f-a7b1-94462f0f9c93" path="/var/lib/kubelet/pods/d8267ff8-754f-446f-a7b1-94462f0f9c93/volumes"
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.680301 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "33984e31-23ff-4d28-9828-74d12b7fc0a7" (UID: "33984e31-23ff-4d28-9828-74d12b7fc0a7"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.697059 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0694bb24-df06-41c3-a24d-8428090b6df4" (UID: "0694bb24-df06-41c3-a24d-8428090b6df4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.697382 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-config-data" (OuterVolumeSpecName: "config-data") pod "0694bb24-df06-41c3-a24d-8428090b6df4" (UID: "0694bb24-df06-41c3-a24d-8428090b6df4"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.716082 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.716138 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-config-data\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.716189 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-scripts\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.716215 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-config-data-custom\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.716268 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-public-tls-certs\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.716306 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/18848c19-7735-494d-babb-32e04c8ef382-logs\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.716332 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18848c19-7735-494d-babb-32e04c8ef382-etc-machine-id\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.716367 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf9b9\" (UniqueName: \"kubernetes.io/projected/18848c19-7735-494d-babb-32e04c8ef382-kube-api-access-hf9b9\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.716401 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.716493 4580 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33984e31-23ff-4d28-9828-74d12b7fc0a7-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.716512 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.716525 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0694bb24-df06-41c3-a24d-8428090b6df4-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.717443 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18848c19-7735-494d-babb-32e04c8ef382-logs\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.720816 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-config-data-custom\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.721925 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.721963 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18848c19-7735-494d-babb-32e04c8ef382-etc-machine-id\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.722183 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-public-tls-certs\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.726307 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-config-data\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.726665 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.727306 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18848c19-7735-494d-babb-32e04c8ef382-scripts\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.745507 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf9b9\" (UniqueName: \"kubernetes.io/projected/18848c19-7735-494d-babb-32e04c8ef382-kube-api-access-hf9b9\") pod \"cinder-api-0\" (UID: \"18848c19-7735-494d-babb-32e04c8ef382\") " pod="openstack/cinder-api-0" Mar 21 05:14:21 crc kubenswrapper[4580]: I0321 05:14:21.841756 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.124199 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0694bb24-df06-41c3-a24d-8428090b6df4","Type":"ContainerDied","Data":"b1e89d7d04d9886474f61f3afde2b48891dabecaf03e02ea3ad4092a37ba7e47"} Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.124514 4580 scope.go:117] "RemoveContainer" containerID="b83fd3821ac6334371330cc325afa955d8d19b6906ad78933d2d62d6b4ec28eb" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.124675 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.140959 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d468d6b8-4xwvd" event={"ID":"33984e31-23ff-4d28-9828-74d12b7fc0a7","Type":"ContainerDied","Data":"1885fea2f8dae75f1673f3b37957d723ebf32926c2610642db44e5b549a9ae1e"} Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.141039 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57d468d6b8-4xwvd" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.159510 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.159735 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6dbb667f95-c5g4x" event={"ID":"21065819-f94d-4cc9-925f-c4be4eeee0d7","Type":"ContainerStarted","Data":"2f7a23bfd061b1034db1a0c73dcdd8ea9b3b2a7665f7ae99e3dc730477b52c02"} Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.199152 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6dbb667f95-c5g4x" podStartSLOduration=12.199134739 podStartE2EDuration="12.199134739s" podCreationTimestamp="2026-03-21 05:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:14:22.187432616 +0000 UTC m=+1367.270016264" watchObservedRunningTime="2026-03-21 05:14:22.199134739 +0000 UTC m=+1367.281718357" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.267948 4580 scope.go:117] "RemoveContainer" containerID="db67d4321b5cd6dafcdc1ed5a2a09ad8d3610f33d42492f866565584c063d3d2" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.286132 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 
05:14:22.317658 4580 scope.go:117] "RemoveContainer" containerID="1e32fbc06eb17d4fbb9d9fd6f9781c6231cf73fa6cc7275bb60b3fbd2ba6c4bf" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.323334 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.340355 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.356191 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.360301 4580 scope.go:117] "RemoveContainer" containerID="13cdf4bc25a2a4ff4f44203ac90f53c94b3c3f7df282de0ab9155d6b47bc2473" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.365861 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.368229 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.372722 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.372914 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-szd8w" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.373062 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.384774 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57d468d6b8-4xwvd"] Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.418942 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.423041 4580 scope.go:117] "RemoveContainer" containerID="d79d9b98b19c66dc420061e3d3ea414cafbef44bb76a885281ddb24f30144e6c" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.452707 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.452816 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.453068 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-scripts\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.453164 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-run-httpd\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.453206 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-config-data\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.453241 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97d42\" (UniqueName: \"kubernetes.io/projected/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-kube-api-access-97d42\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.453561 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-log-httpd\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.502710 4580 scope.go:117] "RemoveContainer" containerID="d13a268270d73439428136cc30f637118603cadce1019c8dd25bc9f0ac7c91d1" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.511562 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] 
Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.511628 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-57d468d6b8-4xwvd"] Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.511662 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.511821 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.524237 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.524435 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.567190 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-run-httpd\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.567252 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-config-data\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.567285 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97d42\" (UniqueName: \"kubernetes.io/projected/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-kube-api-access-97d42\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.567327 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-log-httpd\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.567434 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.567490 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.567547 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-scripts\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.570955 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-run-httpd\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.572663 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-log-httpd\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 
05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.578866 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.586852 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.594119 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.603683 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-config-data\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.623636 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-scripts\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.631428 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97d42\" (UniqueName: \"kubernetes.io/projected/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-kube-api-access-97d42\") pod \"ceilometer-0\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.673049 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca4bc346-fdf8-4e43-8bbb-ea6c80333c43-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43\") " pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.673125 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcjqm\" (UniqueName: \"kubernetes.io/projected/ca4bc346-fdf8-4e43-8bbb-ea6c80333c43-kube-api-access-xcjqm\") pod \"kube-state-metrics-0\" (UID: \"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43\") " pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.673155 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4bc346-fdf8-4e43-8bbb-ea6c80333c43-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43\") " pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.673233 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ca4bc346-fdf8-4e43-8bbb-ea6c80333c43-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43\") " pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.732548 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.776292 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca4bc346-fdf8-4e43-8bbb-ea6c80333c43-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43\") " pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.776445 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcjqm\" (UniqueName: \"kubernetes.io/projected/ca4bc346-fdf8-4e43-8bbb-ea6c80333c43-kube-api-access-xcjqm\") pod \"kube-state-metrics-0\" (UID: \"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43\") " pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.776507 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4bc346-fdf8-4e43-8bbb-ea6c80333c43-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43\") " pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.776664 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ca4bc346-fdf8-4e43-8bbb-ea6c80333c43-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43\") " pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.784947 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca4bc346-fdf8-4e43-8bbb-ea6c80333c43-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43\") " 
pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.788311 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ca4bc346-fdf8-4e43-8bbb-ea6c80333c43-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43\") " pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.794262 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4bc346-fdf8-4e43-8bbb-ea6c80333c43-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43\") " pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.798976 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcjqm\" (UniqueName: \"kubernetes.io/projected/ca4bc346-fdf8-4e43-8bbb-ea6c80333c43-kube-api-access-xcjqm\") pod \"kube-state-metrics-0\" (UID: \"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43\") " pod="openstack/kube-state-metrics-0" Mar 21 05:14:22 crc kubenswrapper[4580]: I0321 05:14:22.867919 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 05:14:23 crc kubenswrapper[4580]: I0321 05:14:23.217130 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"18848c19-7735-494d-babb-32e04c8ef382","Type":"ContainerStarted","Data":"809e0736d17aed6552a5a6077c89bae2895cd9695138480366c18cc0e0ebb117"} Mar 21 05:14:23 crc kubenswrapper[4580]: I0321 05:14:23.217446 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:23 crc kubenswrapper[4580]: I0321 05:14:23.217488 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:23 crc kubenswrapper[4580]: I0321 05:14:23.348329 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:14:23 crc kubenswrapper[4580]: I0321 05:14:23.570760 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:14:23 crc kubenswrapper[4580]: I0321 05:14:23.635069 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0694bb24-df06-41c3-a24d-8428090b6df4" path="/var/lib/kubelet/pods/0694bb24-df06-41c3-a24d-8428090b6df4/volumes" Mar 21 05:14:23 crc kubenswrapper[4580]: I0321 05:14:23.636709 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33984e31-23ff-4d28-9828-74d12b7fc0a7" path="/var/lib/kubelet/pods/33984e31-23ff-4d28-9828-74d12b7fc0a7/volumes" Mar 21 05:14:23 crc kubenswrapper[4580]: I0321 05:14:23.637355 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caa7b0b4-ac59-4338-896e-723db48b3d24" path="/var/lib/kubelet/pods/caa7b0b4-ac59-4338-896e-723db48b3d24/volumes" Mar 21 05:14:23 crc kubenswrapper[4580]: I0321 05:14:23.798569 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:14:24 crc kubenswrapper[4580]: I0321 05:14:24.250142 4580 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"18848c19-7735-494d-babb-32e04c8ef382","Type":"ContainerStarted","Data":"e1fe2a8a3126f3ebf7c382db5896379cba1e5658fa4a94793b216a57d8b5460b"} Mar 21 05:14:24 crc kubenswrapper[4580]: I0321 05:14:24.256014 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4","Type":"ContainerStarted","Data":"da60e517c145c3d6544dede0d54495ae89e326f2c0cabb3e9e534a812902a5fe"} Mar 21 05:14:24 crc kubenswrapper[4580]: I0321 05:14:24.275605 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43","Type":"ContainerStarted","Data":"3c9ca2bee6f1070fa187975bffa10e4fe0e5d131b44b61e93ffd1ef438ea5ca1"} Mar 21 05:14:25 crc kubenswrapper[4580]: I0321 05:14:25.286830 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"18848c19-7735-494d-babb-32e04c8ef382","Type":"ContainerStarted","Data":"487e6cefbaf4d854dd93496db3c1d386a48085c09cb9f276febcded634a71115"} Mar 21 05:14:25 crc kubenswrapper[4580]: I0321 05:14:25.287444 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 21 05:14:25 crc kubenswrapper[4580]: I0321 05:14:25.290135 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ca4bc346-fdf8-4e43-8bbb-ea6c80333c43","Type":"ContainerStarted","Data":"e9988acb1075240c8ee162ecb073602f6d72f070a69bb2eec0141b3787286b44"} Mar 21 05:14:25 crc kubenswrapper[4580]: I0321 05:14:25.290770 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 21 05:14:25 crc kubenswrapper[4580]: I0321 05:14:25.293153 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4","Type":"ContainerStarted","Data":"ec6a90feef6df9210a56ba0234c2766bf2f184c0f2c9fc8765c4f5b30480a682"} Mar 21 05:14:25 crc kubenswrapper[4580]: I0321 05:14:25.320613 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.3205868689999996 podStartE2EDuration="4.320586869s" podCreationTimestamp="2026-03-21 05:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:14:25.311065685 +0000 UTC m=+1370.393649333" watchObservedRunningTime="2026-03-21 05:14:25.320586869 +0000 UTC m=+1370.403170517" Mar 21 05:14:25 crc kubenswrapper[4580]: I0321 05:14:25.331827 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6dbb667f95-c5g4x" podUID="21065819-f94d-4cc9-925f-c4be4eeee0d7" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 21 05:14:25 crc kubenswrapper[4580]: I0321 05:14:25.337648 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.757583698 podStartE2EDuration="3.337625915s" podCreationTimestamp="2026-03-21 05:14:22 +0000 UTC" firstStartedPulling="2026-03-21 05:14:23.606577659 +0000 UTC m=+1368.689161287" lastFinishedPulling="2026-03-21 05:14:24.186619886 +0000 UTC m=+1369.269203504" observedRunningTime="2026-03-21 05:14:25.330154505 +0000 UTC m=+1370.412738143" watchObservedRunningTime="2026-03-21 05:14:25.337625915 +0000 UTC m=+1370.420209543" Mar 21 05:14:26 crc kubenswrapper[4580]: I0321 05:14:26.303313 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4","Type":"ContainerStarted","Data":"6d95b87e34ca007e4598f9fa298e093ee45b4e68ec1b3518928111cc68259c1a"} Mar 21 05:14:26 crc kubenswrapper[4580]: I0321 
05:14:26.350929 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:26 crc kubenswrapper[4580]: I0321 05:14:26.443459 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6dbb667f95-c5g4x" Mar 21 05:14:27 crc kubenswrapper[4580]: I0321 05:14:27.023302 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:14:27 crc kubenswrapper[4580]: I0321 05:14:27.320875 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4","Type":"ContainerStarted","Data":"1d12cac99e2250bd20d1b9eda54fccc6481e73ae9673a65c6438d14333d09658"} Mar 21 05:14:27 crc kubenswrapper[4580]: I0321 05:14:27.396943 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67f74b898d-dtzvd" Mar 21 05:14:27 crc kubenswrapper[4580]: I0321 05:14:27.487231 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55886d54c6-b2qbq"] Mar 21 05:14:27 crc kubenswrapper[4580]: I0321 05:14:27.487904 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55886d54c6-b2qbq" podUID="b91b8cf1-cf3f-4789-b232-03aeea8f47ef" containerName="placement-log" containerID="cri-o://e8eb77b909e1c1f8309a03f5e03250e60939007e839e0a195f74647abdf86540" gracePeriod=30 Mar 21 05:14:27 crc kubenswrapper[4580]: I0321 05:14:27.487960 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55886d54c6-b2qbq" podUID="b91b8cf1-cf3f-4789-b232-03aeea8f47ef" containerName="placement-api" containerID="cri-o://f847a856383c9da180c3eb5c0b37dc8b9361946501a3cd575d400fbbf48a0b86" gracePeriod=30 Mar 21 05:14:28 crc kubenswrapper[4580]: I0321 05:14:28.338634 4580 generic.go:334] "Generic (PLEG): container finished" 
podID="b91b8cf1-cf3f-4789-b232-03aeea8f47ef" containerID="e8eb77b909e1c1f8309a03f5e03250e60939007e839e0a195f74647abdf86540" exitCode=143 Mar 21 05:14:28 crc kubenswrapper[4580]: I0321 05:14:28.338701 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55886d54c6-b2qbq" event={"ID":"b91b8cf1-cf3f-4789-b232-03aeea8f47ef","Type":"ContainerDied","Data":"e8eb77b909e1c1f8309a03f5e03250e60939007e839e0a195f74647abdf86540"} Mar 21 05:14:29 crc kubenswrapper[4580]: I0321 05:14:29.352080 4580 generic.go:334] "Generic (PLEG): container finished" podID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerID="06d40532a000f5070c7b72a587cc49ac7da9bf2b5aaac16aab263f4f9950bff0" exitCode=1 Mar 21 05:14:29 crc kubenswrapper[4580]: I0321 05:14:29.352157 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4","Type":"ContainerDied","Data":"06d40532a000f5070c7b72a587cc49ac7da9bf2b5aaac16aab263f4f9950bff0"} Mar 21 05:14:29 crc kubenswrapper[4580]: I0321 05:14:29.352255 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="ceilometer-central-agent" containerID="cri-o://ec6a90feef6df9210a56ba0234c2766bf2f184c0f2c9fc8765c4f5b30480a682" gracePeriod=30 Mar 21 05:14:29 crc kubenswrapper[4580]: I0321 05:14:29.352296 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="sg-core" containerID="cri-o://1d12cac99e2250bd20d1b9eda54fccc6481e73ae9673a65c6438d14333d09658" gracePeriod=30 Mar 21 05:14:29 crc kubenswrapper[4580]: I0321 05:14:29.352319 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="ceilometer-notification-agent" 
containerID="cri-o://6d95b87e34ca007e4598f9fa298e093ee45b4e68ec1b3518928111cc68259c1a" gracePeriod=30 Mar 21 05:14:29 crc kubenswrapper[4580]: I0321 05:14:29.729060 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:14:29 crc kubenswrapper[4580]: I0321 05:14:29.729414 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c812288c-8500-4ea0-b8e8-b835bce24ac1" containerName="glance-log" containerID="cri-o://cc0760a3da7aab4698e56cd44ccdd1b7aab1a15cc610061aa17a82b49df8264d" gracePeriod=30 Mar 21 05:14:29 crc kubenswrapper[4580]: I0321 05:14:29.729571 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c812288c-8500-4ea0-b8e8-b835bce24ac1" containerName="glance-httpd" containerID="cri-o://323da36ba85fcdd25d911164fac79635954bffb8abe58043bddadf33f713b412" gracePeriod=30 Mar 21 05:14:30 crc kubenswrapper[4580]: I0321 05:14:30.364807 4580 generic.go:334] "Generic (PLEG): container finished" podID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerID="1d12cac99e2250bd20d1b9eda54fccc6481e73ae9673a65c6438d14333d09658" exitCode=2 Mar 21 05:14:30 crc kubenswrapper[4580]: I0321 05:14:30.364878 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4","Type":"ContainerDied","Data":"1d12cac99e2250bd20d1b9eda54fccc6481e73ae9673a65c6438d14333d09658"} Mar 21 05:14:30 crc kubenswrapper[4580]: I0321 05:14:30.367409 4580 generic.go:334] "Generic (PLEG): container finished" podID="c812288c-8500-4ea0-b8e8-b835bce24ac1" containerID="cc0760a3da7aab4698e56cd44ccdd1b7aab1a15cc610061aa17a82b49df8264d" exitCode=143 Mar 21 05:14:30 crc kubenswrapper[4580]: I0321 05:14:30.367446 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"c812288c-8500-4ea0-b8e8-b835bce24ac1","Type":"ContainerDied","Data":"cc0760a3da7aab4698e56cd44ccdd1b7aab1a15cc610061aa17a82b49df8264d"} Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.377628 4580 generic.go:334] "Generic (PLEG): container finished" podID="b91b8cf1-cf3f-4789-b232-03aeea8f47ef" containerID="f847a856383c9da180c3eb5c0b37dc8b9361946501a3cd575d400fbbf48a0b86" exitCode=0 Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.377664 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55886d54c6-b2qbq" event={"ID":"b91b8cf1-cf3f-4789-b232-03aeea8f47ef","Type":"ContainerDied","Data":"f847a856383c9da180c3eb5c0b37dc8b9361946501a3cd575d400fbbf48a0b86"} Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.401252 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.511440 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.839728 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55886d54c6-b2qbq" Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.938566 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-combined-ca-bundle\") pod \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.938642 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-internal-tls-certs\") pod \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.938711 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db2cc\" (UniqueName: \"kubernetes.io/projected/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-kube-api-access-db2cc\") pod \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.938818 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-scripts\") pod \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.938850 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-config-data\") pod \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.938911 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-logs\") pod \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.938956 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-public-tls-certs\") pod \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\" (UID: \"b91b8cf1-cf3f-4789-b232-03aeea8f47ef\") " Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.942753 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-logs" (OuterVolumeSpecName: "logs") pod "b91b8cf1-cf3f-4789-b232-03aeea8f47ef" (UID: "b91b8cf1-cf3f-4789-b232-03aeea8f47ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.956941 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-kube-api-access-db2cc" (OuterVolumeSpecName: "kube-api-access-db2cc") pod "b91b8cf1-cf3f-4789-b232-03aeea8f47ef" (UID: "b91b8cf1-cf3f-4789-b232-03aeea8f47ef"). InnerVolumeSpecName "kube-api-access-db2cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:31 crc kubenswrapper[4580]: I0321 05:14:31.988509 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-scripts" (OuterVolumeSpecName: "scripts") pod "b91b8cf1-cf3f-4789-b232-03aeea8f47ef" (UID: "b91b8cf1-cf3f-4789-b232-03aeea8f47ef"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.041901 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.041951 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db2cc\" (UniqueName: \"kubernetes.io/projected/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-kube-api-access-db2cc\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.041966 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.165940 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-config-data" (OuterVolumeSpecName: "config-data") pod "b91b8cf1-cf3f-4789-b232-03aeea8f47ef" (UID: "b91b8cf1-cf3f-4789-b232-03aeea8f47ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.166368 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b91b8cf1-cf3f-4789-b232-03aeea8f47ef" (UID: "b91b8cf1-cf3f-4789-b232-03aeea8f47ef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.246514 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.247077 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.251858 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b91b8cf1-cf3f-4789-b232-03aeea8f47ef" (UID: "b91b8cf1-cf3f-4789-b232-03aeea8f47ef"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.308291 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b91b8cf1-cf3f-4789-b232-03aeea8f47ef" (UID: "b91b8cf1-cf3f-4789-b232-03aeea8f47ef"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.348725 4580 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.348759 4580 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91b8cf1-cf3f-4789-b232-03aeea8f47ef-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.388355 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55886d54c6-b2qbq" event={"ID":"b91b8cf1-cf3f-4789-b232-03aeea8f47ef","Type":"ContainerDied","Data":"fe6857fa07e803052bfcb06c7c1f015220a0fbb02e6d02a75107838425b2f8a1"} Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.388407 4580 scope.go:117] "RemoveContainer" containerID="f847a856383c9da180c3eb5c0b37dc8b9361946501a3cd575d400fbbf48a0b86" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.388420 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55886d54c6-b2qbq" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.421488 4580 scope.go:117] "RemoveContainer" containerID="e8eb77b909e1c1f8309a03f5e03250e60939007e839e0a195f74647abdf86540" Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.448949 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55886d54c6-b2qbq"] Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.466958 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-55886d54c6-b2qbq"] Mar 21 05:14:32 crc kubenswrapper[4580]: I0321 05:14:32.900553 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.295023 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.295260 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" containerName="glance-log" containerID="cri-o://1611d88ed1574f196250e15c569749879849b59f34652e475b3b7008de9a4f79" gracePeriod=30 Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.295700 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" containerName="glance-httpd" containerID="cri-o://8eed586d42a9da62f78732e2bf8bcfd250c213f5e900feb05784cb50f738502b" gracePeriod=30 Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.398797 4580 generic.go:334] "Generic (PLEG): container finished" podID="c812288c-8500-4ea0-b8e8-b835bce24ac1" containerID="323da36ba85fcdd25d911164fac79635954bffb8abe58043bddadf33f713b412" exitCode=0 Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.398906 4580 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"c812288c-8500-4ea0-b8e8-b835bce24ac1","Type":"ContainerDied","Data":"323da36ba85fcdd25d911164fac79635954bffb8abe58043bddadf33f713b412"} Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.633169 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b91b8cf1-cf3f-4789-b232-03aeea8f47ef" path="/var/lib/kubelet/pods/b91b8cf1-cf3f-4789-b232-03aeea8f47ef/volumes" Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.909279 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.982445 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhzrd\" (UniqueName: \"kubernetes.io/projected/c812288c-8500-4ea0-b8e8-b835bce24ac1-kube-api-access-zhzrd\") pod \"c812288c-8500-4ea0-b8e8-b835bce24ac1\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.982508 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-config-data\") pod \"c812288c-8500-4ea0-b8e8-b835bce24ac1\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.982548 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c812288c-8500-4ea0-b8e8-b835bce24ac1-httpd-run\") pod \"c812288c-8500-4ea0-b8e8-b835bce24ac1\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.982575 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c812288c-8500-4ea0-b8e8-b835bce24ac1-logs\") pod \"c812288c-8500-4ea0-b8e8-b835bce24ac1\" (UID: 
\"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.982619 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-combined-ca-bundle\") pod \"c812288c-8500-4ea0-b8e8-b835bce24ac1\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.982648 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"c812288c-8500-4ea0-b8e8-b835bce24ac1\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.982665 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-scripts\") pod \"c812288c-8500-4ea0-b8e8-b835bce24ac1\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.982702 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-public-tls-certs\") pod \"c812288c-8500-4ea0-b8e8-b835bce24ac1\" (UID: \"c812288c-8500-4ea0-b8e8-b835bce24ac1\") " Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.983142 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c812288c-8500-4ea0-b8e8-b835bce24ac1-logs" (OuterVolumeSpecName: "logs") pod "c812288c-8500-4ea0-b8e8-b835bce24ac1" (UID: "c812288c-8500-4ea0-b8e8-b835bce24ac1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.983427 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c812288c-8500-4ea0-b8e8-b835bce24ac1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c812288c-8500-4ea0-b8e8-b835bce24ac1" (UID: "c812288c-8500-4ea0-b8e8-b835bce24ac1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.992485 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-scripts" (OuterVolumeSpecName: "scripts") pod "c812288c-8500-4ea0-b8e8-b835bce24ac1" (UID: "c812288c-8500-4ea0-b8e8-b835bce24ac1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:33 crc kubenswrapper[4580]: I0321 05:14:33.994952 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "c812288c-8500-4ea0-b8e8-b835bce24ac1" (UID: "c812288c-8500-4ea0-b8e8-b835bce24ac1"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.020257 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c812288c-8500-4ea0-b8e8-b835bce24ac1-kube-api-access-zhzrd" (OuterVolumeSpecName: "kube-api-access-zhzrd") pod "c812288c-8500-4ea0-b8e8-b835bce24ac1" (UID: "c812288c-8500-4ea0-b8e8-b835bce24ac1"). InnerVolumeSpecName "kube-api-access-zhzrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.052920 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c812288c-8500-4ea0-b8e8-b835bce24ac1" (UID: "c812288c-8500-4ea0-b8e8-b835bce24ac1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.085047 4580 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c812288c-8500-4ea0-b8e8-b835bce24ac1-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.085078 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c812288c-8500-4ea0-b8e8-b835bce24ac1-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.085087 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.085109 4580 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.085120 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.085130 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhzrd\" (UniqueName: 
\"kubernetes.io/projected/c812288c-8500-4ea0-b8e8-b835bce24ac1-kube-api-access-zhzrd\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.107836 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-config-data" (OuterVolumeSpecName: "config-data") pod "c812288c-8500-4ea0-b8e8-b835bce24ac1" (UID: "c812288c-8500-4ea0-b8e8-b835bce24ac1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.115118 4580 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.130947 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c812288c-8500-4ea0-b8e8-b835bce24ac1" (UID: "c812288c-8500-4ea0-b8e8-b835bce24ac1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.187974 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.188216 4580 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.188355 4580 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c812288c-8500-4ea0-b8e8-b835bce24ac1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.407898 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c812288c-8500-4ea0-b8e8-b835bce24ac1","Type":"ContainerDied","Data":"10e6ef65272e29c7629603520473bc0c71e68674079d4ee2c2e8b5e0f40dd651"} Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.407948 4580 scope.go:117] "RemoveContainer" containerID="323da36ba85fcdd25d911164fac79635954bffb8abe58043bddadf33f713b412" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.408171 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.417485 4580 generic.go:334] "Generic (PLEG): container finished" podID="b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" containerID="1611d88ed1574f196250e15c569749879849b59f34652e475b3b7008de9a4f79" exitCode=143 Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.417526 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46","Type":"ContainerDied","Data":"1611d88ed1574f196250e15c569749879849b59f34652e475b3b7008de9a4f79"} Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.441575 4580 scope.go:117] "RemoveContainer" containerID="cc0760a3da7aab4698e56cd44ccdd1b7aab1a15cc610061aa17a82b49df8264d" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.446681 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.506871 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.524945 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:14:34 crc kubenswrapper[4580]: E0321 05:14:34.526580 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91b8cf1-cf3f-4789-b232-03aeea8f47ef" containerName="placement-api" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.526603 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91b8cf1-cf3f-4789-b232-03aeea8f47ef" containerName="placement-api" Mar 21 05:14:34 crc kubenswrapper[4580]: E0321 05:14:34.526640 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c812288c-8500-4ea0-b8e8-b835bce24ac1" containerName="glance-httpd" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.526720 4580 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c812288c-8500-4ea0-b8e8-b835bce24ac1" containerName="glance-httpd" Mar 21 05:14:34 crc kubenswrapper[4580]: E0321 05:14:34.526739 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c812288c-8500-4ea0-b8e8-b835bce24ac1" containerName="glance-log" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.526746 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c812288c-8500-4ea0-b8e8-b835bce24ac1" containerName="glance-log" Mar 21 05:14:34 crc kubenswrapper[4580]: E0321 05:14:34.526791 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91b8cf1-cf3f-4789-b232-03aeea8f47ef" containerName="placement-log" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.526798 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91b8cf1-cf3f-4789-b232-03aeea8f47ef" containerName="placement-log" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.527717 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="b91b8cf1-cf3f-4789-b232-03aeea8f47ef" containerName="placement-log" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.527758 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="c812288c-8500-4ea0-b8e8-b835bce24ac1" containerName="glance-log" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.527770 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="b91b8cf1-cf3f-4789-b232-03aeea8f47ef" containerName="placement-api" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.527989 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="c812288c-8500-4ea0-b8e8-b835bce24ac1" containerName="glance-httpd" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.531569 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.555107 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.555342 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.561303 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.708730 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/551ce5a9-fc21-4f0c-9c38-d53b829c5979-config-data\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.709255 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/551ce5a9-fc21-4f0c-9c38-d53b829c5979-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.709377 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/551ce5a9-fc21-4f0c-9c38-d53b829c5979-scripts\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.709514 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/551ce5a9-fc21-4f0c-9c38-d53b829c5979-logs\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.709627 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/551ce5a9-fc21-4f0c-9c38-d53b829c5979-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.709732 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.709867 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551ce5a9-fc21-4f0c-9c38-d53b829c5979-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.709945 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6l6x\" (UniqueName: \"kubernetes.io/projected/551ce5a9-fc21-4f0c-9c38-d53b829c5979-kube-api-access-h6l6x\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.812409 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/551ce5a9-fc21-4f0c-9c38-d53b829c5979-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.812466 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6l6x\" (UniqueName: \"kubernetes.io/projected/551ce5a9-fc21-4f0c-9c38-d53b829c5979-kube-api-access-h6l6x\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.812533 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/551ce5a9-fc21-4f0c-9c38-d53b829c5979-config-data\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.812570 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/551ce5a9-fc21-4f0c-9c38-d53b829c5979-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.812596 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/551ce5a9-fc21-4f0c-9c38-d53b829c5979-scripts\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.812648 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/551ce5a9-fc21-4f0c-9c38-d53b829c5979-logs\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.812673 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/551ce5a9-fc21-4f0c-9c38-d53b829c5979-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.812709 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.812920 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.821476 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/551ce5a9-fc21-4f0c-9c38-d53b829c5979-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.822095 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/551ce5a9-fc21-4f0c-9c38-d53b829c5979-logs\") pod \"glance-default-external-api-0\" 
(UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.827560 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/551ce5a9-fc21-4f0c-9c38-d53b829c5979-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.827922 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/551ce5a9-fc21-4f0c-9c38-d53b829c5979-scripts\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.837514 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551ce5a9-fc21-4f0c-9c38-d53b829c5979-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.844160 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/551ce5a9-fc21-4f0c-9c38-d53b829c5979-config-data\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.865501 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc 
kubenswrapper[4580]: I0321 05:14:34.873686 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6l6x\" (UniqueName: \"kubernetes.io/projected/551ce5a9-fc21-4f0c-9c38-d53b829c5979-kube-api-access-h6l6x\") pod \"glance-default-external-api-0\" (UID: \"551ce5a9-fc21-4f0c-9c38-d53b829c5979\") " pod="openstack/glance-default-external-api-0" Mar 21 05:14:34 crc kubenswrapper[4580]: I0321 05:14:34.876468 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 05:14:35 crc kubenswrapper[4580]: I0321 05:14:35.477406 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:14:35 crc kubenswrapper[4580]: I0321 05:14:35.645809 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c812288c-8500-4ea0-b8e8-b835bce24ac1" path="/var/lib/kubelet/pods/c812288c-8500-4ea0-b8e8-b835bce24ac1/volumes" Mar 21 05:14:35 crc kubenswrapper[4580]: I0321 05:14:35.850076 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="18848c19-7735-494d-babb-32e04c8ef382" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.173:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 05:14:36 crc kubenswrapper[4580]: I0321 05:14:36.460717 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"551ce5a9-fc21-4f0c-9c38-d53b829c5979","Type":"ContainerStarted","Data":"bf06c9aa02b35ebbbe3f22301e13c70e27d5c6285bc933ee4416510df5da2dcf"} Mar 21 05:14:36 crc kubenswrapper[4580]: I0321 05:14:36.460762 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"551ce5a9-fc21-4f0c-9c38-d53b829c5979","Type":"ContainerStarted","Data":"60564355acb77f99637ad22d78abad69dc45574a308952dffc6ebf446daf5399"} Mar 21 05:14:36 
crc kubenswrapper[4580]: I0321 05:14:36.750505 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.477208 4580 generic.go:334] "Generic (PLEG): container finished" podID="b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" containerID="8eed586d42a9da62f78732e2bf8bcfd250c213f5e900feb05784cb50f738502b" exitCode=0 Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.477726 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46","Type":"ContainerDied","Data":"8eed586d42a9da62f78732e2bf8bcfd250c213f5e900feb05784cb50f738502b"} Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.484923 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"551ce5a9-fc21-4f0c-9c38-d53b829c5979","Type":"ContainerStarted","Data":"9f7b4e13abdb40b00f4cdcc6a7f8ae620e9bde548cba11c5fe6f104b51cb4451"} Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.522136 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.522124238 podStartE2EDuration="3.522124238s" podCreationTimestamp="2026-03-21 05:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:14:37.521717038 +0000 UTC m=+1382.604300676" watchObservedRunningTime="2026-03-21 05:14:37.522124238 +0000 UTC m=+1382.604707866" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.573617 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.684236 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-config-data\") pod \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.684483 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx785\" (UniqueName: \"kubernetes.io/projected/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-kube-api-access-cx785\") pod \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.684571 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-scripts\") pod \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.684679 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-internal-tls-certs\") pod \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.684827 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-logs\") pod \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.685115 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-combined-ca-bundle\") pod \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.685618 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-httpd-run\") pod \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.685729 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\" (UID: \"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46\") " Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.773615 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-logs" (OuterVolumeSpecName: "logs") pod "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" (UID: "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.774663 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" (UID: "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.787893 4580 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.787932 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.821816 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-scripts" (OuterVolumeSpecName: "scripts") pod "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" (UID: "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.860135 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-kube-api-access-cx785" (OuterVolumeSpecName: "kube-api-access-cx785") pod "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" (UID: "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46"). InnerVolumeSpecName "kube-api-access-cx785". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.863703 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" (UID: "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.868275 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" (UID: "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.872043 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-config-data" (OuterVolumeSpecName: "config-data") pod "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" (UID: "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.885244 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" (UID: "b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.889305 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.889540 4580 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.889623 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.889699 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx785\" (UniqueName: \"kubernetes.io/projected/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-kube-api-access-cx785\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.889826 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.889913 4580 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.918888 4580 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 21 05:14:37 crc kubenswrapper[4580]: I0321 05:14:37.991488 4580 reconciler_common.go:293] "Volume detached for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.513557 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46","Type":"ContainerDied","Data":"a73426c1d2aab450042fe0cbc7f723d4988ec8230ad0037303a521689d28ba21"} Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.513924 4580 scope.go:117] "RemoveContainer" containerID="8eed586d42a9da62f78732e2bf8bcfd250c213f5e900feb05784cb50f738502b" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.514097 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.571738 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.580424 4580 scope.go:117] "RemoveContainer" containerID="1611d88ed1574f196250e15c569749879849b59f34652e475b3b7008de9a4f79" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.587041 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.615831 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:14:38 crc kubenswrapper[4580]: E0321 05:14:38.616939 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" containerName="glance-log" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.617056 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" containerName="glance-log" Mar 21 05:14:38 crc kubenswrapper[4580]: E0321 05:14:38.617226 4580 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" containerName="glance-httpd" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.617331 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" containerName="glance-httpd" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.622198 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" containerName="glance-log" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.622342 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" containerName="glance-httpd" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.623731 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.627222 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.627394 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.631447 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.809817 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f151509-94d2-4991-89f7-c7757d14b867-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.809910 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5f151509-94d2-4991-89f7-c7757d14b867-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.809955 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f151509-94d2-4991-89f7-c7757d14b867-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.809997 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f151509-94d2-4991-89f7-c7757d14b867-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.810043 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.810083 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f151509-94d2-4991-89f7-c7757d14b867-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.810120 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/5f151509-94d2-4991-89f7-c7757d14b867-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.810177 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9xxm\" (UniqueName: \"kubernetes.io/projected/5f151509-94d2-4991-89f7-c7757d14b867-kube-api-access-m9xxm\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.911867 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f151509-94d2-4991-89f7-c7757d14b867-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.911913 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f151509-94d2-4991-89f7-c7757d14b867-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.911948 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f151509-94d2-4991-89f7-c7757d14b867-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.911988 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.912017 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f151509-94d2-4991-89f7-c7757d14b867-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.912048 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f151509-94d2-4991-89f7-c7757d14b867-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.912094 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9xxm\" (UniqueName: \"kubernetes.io/projected/5f151509-94d2-4991-89f7-c7757d14b867-kube-api-access-m9xxm\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.912155 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f151509-94d2-4991-89f7-c7757d14b867-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.912821 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f151509-94d2-4991-89f7-c7757d14b867-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.912870 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0"
Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.912955 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f151509-94d2-4991-89f7-c7757d14b867-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.920804 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f151509-94d2-4991-89f7-c7757d14b867-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.921772 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f151509-94d2-4991-89f7-c7757d14b867-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.924704 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f151509-94d2-4991-89f7-c7757d14b867-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.925249 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f151509-94d2-4991-89f7-c7757d14b867-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:14:38 crc kubenswrapper[4580]: I0321 05:14:38.979979 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9xxm\" (UniqueName: \"kubernetes.io/projected/5f151509-94d2-4991-89f7-c7757d14b867-kube-api-access-m9xxm\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:14:39 crc kubenswrapper[4580]: I0321 05:14:39.024259 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f151509-94d2-4991-89f7-c7757d14b867\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:14:39 crc kubenswrapper[4580]: I0321 05:14:39.244239 4580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 21 05:14:39 crc kubenswrapper[4580]: I0321 05:14:39.661580 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46" path="/var/lib/kubelet/pods/b9823fd6-1f6b-4d03-aa58-f4fdba8e7f46/volumes"
Mar 21 05:14:39 crc kubenswrapper[4580]: I0321 05:14:39.918962 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 05:14:40 crc kubenswrapper[4580]: I0321 05:14:40.582173 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f151509-94d2-4991-89f7-c7757d14b867","Type":"ContainerStarted","Data":"8ce6df3cae80171f296fae36798e03cecd39a051a3130d1fdf5ae853c0651f26"}
Mar 21 05:14:40 crc kubenswrapper[4580]: I0321 05:14:40.582529 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f151509-94d2-4991-89f7-c7757d14b867","Type":"ContainerStarted","Data":"e418d7ea3d2a109772a4ffeda9691832c116ef545f2e4dcb59b83c2f8822bcf9"}
Mar 21 05:14:41 crc kubenswrapper[4580]: I0321 05:14:41.394450 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Mar 21 05:14:41 crc kubenswrapper[4580]: I0321 05:14:41.508597 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused"
Mar 21 05:14:41 crc kubenswrapper[4580]: I0321 05:14:41.591920 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f151509-94d2-4991-89f7-c7757d14b867","Type":"ContainerStarted","Data":"2713605a41072246ce1dc6ec56f5ebde5e249e7ae6b0d24092d2a68220818a69"}
Mar 21 05:14:41 crc kubenswrapper[4580]: I0321 05:14:41.622761 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.622739158 podStartE2EDuration="3.622739158s" podCreationTimestamp="2026-03-21 05:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:14:41.612310489 +0000 UTC m=+1386.694894137" watchObservedRunningTime="2026-03-21 05:14:41.622739158 +0000 UTC m=+1386.705322786"
Mar 21 05:14:44 crc kubenswrapper[4580]: I0321 05:14:44.877606 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 21 05:14:44 crc kubenswrapper[4580]: I0321 05:14:44.878187 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 21 05:14:44 crc kubenswrapper[4580]: I0321 05:14:44.923627 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 21 05:14:44 crc kubenswrapper[4580]: I0321 05:14:44.926602 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 21 05:14:45 crc kubenswrapper[4580]: I0321 05:14:45.628910 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 21 05:14:45 crc kubenswrapper[4580]: I0321 05:14:45.628942 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 21 05:14:45 crc kubenswrapper[4580]: I0321 05:14:45.935129 4580 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openstack/nova-api-db-create-t6f99"]
Mar 21 05:14:45 crc kubenswrapper[4580]: I0321 05:14:45.937695 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t6f99"
Mar 21 05:14:45 crc kubenswrapper[4580]: I0321 05:14:45.943465 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-t6f99"]
Mar 21 05:14:45 crc kubenswrapper[4580]: I0321 05:14:45.947889 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:14:45 crc kubenswrapper[4580]: I0321 05:14:45.947942 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:14:45 crc kubenswrapper[4580]: I0321 05:14:45.947980 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj"
Mar 21 05:14:45 crc kubenswrapper[4580]: I0321 05:14:45.948693 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0008875a2f7ef6e2119165dc1e0e253e98f01735aec210fb18c6ffa1eebbb281"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 21 05:14:45 crc kubenswrapper[4580]: I0321 05:14:45.948745 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://0008875a2f7ef6e2119165dc1e0e253e98f01735aec210fb18c6ffa1eebbb281" gracePeriod=600
Mar 21 05:14:45 crc kubenswrapper[4580]: I0321 05:14:45.960975 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzjn5\" (UniqueName: \"kubernetes.io/projected/0bd7c6be-cb25-4e54-9d83-c40520b86d3e-kube-api-access-kzjn5\") pod \"nova-api-db-create-t6f99\" (UID: \"0bd7c6be-cb25-4e54-9d83-c40520b86d3e\") " pod="openstack/nova-api-db-create-t6f99"
Mar 21 05:14:45 crc kubenswrapper[4580]: I0321 05:14:45.961135 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bd7c6be-cb25-4e54-9d83-c40520b86d3e-operator-scripts\") pod \"nova-api-db-create-t6f99\" (UID: \"0bd7c6be-cb25-4e54-9d83-c40520b86d3e\") " pod="openstack/nova-api-db-create-t6f99"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.030240 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mtd7h"]
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.031369 4580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-db-create-mtd7h"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.068824 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87mv5\" (UniqueName: \"kubernetes.io/projected/387f8115-6262-4bbf-9277-1452a5b29e47-kube-api-access-87mv5\") pod \"nova-cell0-db-create-mtd7h\" (UID: \"387f8115-6262-4bbf-9277-1452a5b29e47\") " pod="openstack/nova-cell0-db-create-mtd7h"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.068908 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bd7c6be-cb25-4e54-9d83-c40520b86d3e-operator-scripts\") pod \"nova-api-db-create-t6f99\" (UID: \"0bd7c6be-cb25-4e54-9d83-c40520b86d3e\") " pod="openstack/nova-api-db-create-t6f99"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.068929 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/387f8115-6262-4bbf-9277-1452a5b29e47-operator-scripts\") pod \"nova-cell0-db-create-mtd7h\" (UID: \"387f8115-6262-4bbf-9277-1452a5b29e47\") " pod="openstack/nova-cell0-db-create-mtd7h"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.069018 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzjn5\" (UniqueName: \"kubernetes.io/projected/0bd7c6be-cb25-4e54-9d83-c40520b86d3e-kube-api-access-kzjn5\") pod \"nova-api-db-create-t6f99\" (UID: \"0bd7c6be-cb25-4e54-9d83-c40520b86d3e\") " pod="openstack/nova-api-db-create-t6f99"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.070060 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bd7c6be-cb25-4e54-9d83-c40520b86d3e-operator-scripts\") pod \"nova-api-db-create-t6f99\" (UID: \"0bd7c6be-cb25-4e54-9d83-c40520b86d3e\") " pod="openstack/nova-api-db-create-t6f99"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.116543 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzjn5\" (UniqueName: \"kubernetes.io/projected/0bd7c6be-cb25-4e54-9d83-c40520b86d3e-kube-api-access-kzjn5\") pod \"nova-api-db-create-t6f99\" (UID: \"0bd7c6be-cb25-4e54-9d83-c40520b86d3e\") " pod="openstack/nova-api-db-create-t6f99"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.147041 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5b39-account-create-update-b7fwt"]
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.153851 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mtd7h"]
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.154436 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5b39-account-create-update-b7fwt"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.158934 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.174665 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87mv5\" (UniqueName: \"kubernetes.io/projected/387f8115-6262-4bbf-9277-1452a5b29e47-kube-api-access-87mv5\") pod \"nova-cell0-db-create-mtd7h\" (UID: \"387f8115-6262-4bbf-9277-1452a5b29e47\") " pod="openstack/nova-cell0-db-create-mtd7h"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.175170 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/387f8115-6262-4bbf-9277-1452a5b29e47-operator-scripts\") pod \"nova-cell0-db-create-mtd7h\" (UID: \"387f8115-6262-4bbf-9277-1452a5b29e47\") " pod="openstack/nova-cell0-db-create-mtd7h"
Mar 21 05:14:46 crc kubenswrapper[4580]:
I0321 05:14:46.177014 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/387f8115-6262-4bbf-9277-1452a5b29e47-operator-scripts\") pod \"nova-cell0-db-create-mtd7h\" (UID: \"387f8115-6262-4bbf-9277-1452a5b29e47\") " pod="openstack/nova-cell0-db-create-mtd7h"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.207676 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87mv5\" (UniqueName: \"kubernetes.io/projected/387f8115-6262-4bbf-9277-1452a5b29e47-kube-api-access-87mv5\") pod \"nova-cell0-db-create-mtd7h\" (UID: \"387f8115-6262-4bbf-9277-1452a5b29e47\") " pod="openstack/nova-cell0-db-create-mtd7h"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.225997 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5b39-account-create-update-b7fwt"]
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.267726 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t6f99"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.278871 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9923e811-8726-478a-8357-55d473752028-operator-scripts\") pod \"nova-api-5b39-account-create-update-b7fwt\" (UID: \"9923e811-8726-478a-8357-55d473752028\") " pod="openstack/nova-api-5b39-account-create-update-b7fwt"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.279025 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng4xg\" (UniqueName: \"kubernetes.io/projected/9923e811-8726-478a-8357-55d473752028-kube-api-access-ng4xg\") pod \"nova-api-5b39-account-create-update-b7fwt\" (UID: \"9923e811-8726-478a-8357-55d473752028\") " pod="openstack/nova-api-5b39-account-create-update-b7fwt"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.320036 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wtwjf"]
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.321401 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wtwjf"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.340140 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wtwjf"]
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.353069 4580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-db-create-mtd7h"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.372870 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8a46-account-create-update-nh9zc"]
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.384409 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9923e811-8726-478a-8357-55d473752028-operator-scripts\") pod \"nova-api-5b39-account-create-update-b7fwt\" (UID: \"9923e811-8726-478a-8357-55d473752028\") " pod="openstack/nova-api-5b39-account-create-update-b7fwt"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.384532 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a8958a-c142-475e-b5c6-3d405ad76dda-operator-scripts\") pod \"nova-cell1-db-create-wtwjf\" (UID: \"45a8958a-c142-475e-b5c6-3d405ad76dda\") " pod="openstack/nova-cell1-db-create-wtwjf"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.384570 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b65sw\" (UniqueName: \"kubernetes.io/projected/45a8958a-c142-475e-b5c6-3d405ad76dda-kube-api-access-b65sw\") pod \"nova-cell1-db-create-wtwjf\" (UID: \"45a8958a-c142-475e-b5c6-3d405ad76dda\") " pod="openstack/nova-cell1-db-create-wtwjf"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.384627 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng4xg\" (UniqueName: \"kubernetes.io/projected/9923e811-8726-478a-8357-55d473752028-kube-api-access-ng4xg\") pod \"nova-api-5b39-account-create-update-b7fwt\" (UID: \"9923e811-8726-478a-8357-55d473752028\") " pod="openstack/nova-api-5b39-account-create-update-b7fwt"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.385481 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9923e811-8726-478a-8357-55d473752028-operator-scripts\") pod \"nova-api-5b39-account-create-update-b7fwt\" (UID: \"9923e811-8726-478a-8357-55d473752028\") " pod="openstack/nova-api-5b39-account-create-update-b7fwt"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.400469 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8a46-account-create-update-nh9zc"]
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.400569 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8a46-account-create-update-nh9zc"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.408217 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.448341 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng4xg\" (UniqueName: \"kubernetes.io/projected/9923e811-8726-478a-8357-55d473752028-kube-api-access-ng4xg\") pod \"nova-api-5b39-account-create-update-b7fwt\" (UID: \"9923e811-8726-478a-8357-55d473752028\") " pod="openstack/nova-api-5b39-account-create-update-b7fwt"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.489253 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a8958a-c142-475e-b5c6-3d405ad76dda-operator-scripts\") pod \"nova-cell1-db-create-wtwjf\" (UID: \"45a8958a-c142-475e-b5c6-3d405ad76dda\") " pod="openstack/nova-cell1-db-create-wtwjf"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.489529 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b65sw\" (UniqueName: \"kubernetes.io/projected/45a8958a-c142-475e-b5c6-3d405ad76dda-kube-api-access-b65sw\") pod
\"nova-cell1-db-create-wtwjf\" (UID: \"45a8958a-c142-475e-b5c6-3d405ad76dda\") " pod="openstack/nova-cell1-db-create-wtwjf"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.489554 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j76qh\" (UniqueName: \"kubernetes.io/projected/89a60a11-c3ae-4f01-8397-dd383dc3fc64-kube-api-access-j76qh\") pod \"nova-cell0-8a46-account-create-update-nh9zc\" (UID: \"89a60a11-c3ae-4f01-8397-dd383dc3fc64\") " pod="openstack/nova-cell0-8a46-account-create-update-nh9zc"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.489614 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89a60a11-c3ae-4f01-8397-dd383dc3fc64-operator-scripts\") pod \"nova-cell0-8a46-account-create-update-nh9zc\" (UID: \"89a60a11-c3ae-4f01-8397-dd383dc3fc64\") " pod="openstack/nova-cell0-8a46-account-create-update-nh9zc"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.490438 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a8958a-c142-475e-b5c6-3d405ad76dda-operator-scripts\") pod \"nova-cell1-db-create-wtwjf\" (UID: \"45a8958a-c142-475e-b5c6-3d405ad76dda\") " pod="openstack/nova-cell1-db-create-wtwjf"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.523562 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b65sw\" (UniqueName: \"kubernetes.io/projected/45a8958a-c142-475e-b5c6-3d405ad76dda-kube-api-access-b65sw\") pod \"nova-cell1-db-create-wtwjf\" (UID: \"45a8958a-c142-475e-b5c6-3d405ad76dda\") " pod="openstack/nova-cell1-db-create-wtwjf"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.558316 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6631-account-create-update-ndfnr"]
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.559447 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6631-account-create-update-ndfnr"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.574932 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.575409 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6631-account-create-update-ndfnr"]
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.601959 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j76qh\" (UniqueName: \"kubernetes.io/projected/89a60a11-c3ae-4f01-8397-dd383dc3fc64-kube-api-access-j76qh\") pod \"nova-cell0-8a46-account-create-update-nh9zc\" (UID: \"89a60a11-c3ae-4f01-8397-dd383dc3fc64\") " pod="openstack/nova-cell0-8a46-account-create-update-nh9zc"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.602042 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6fc275e-ec41-4c09-b451-67136fa617fd-operator-scripts\") pod \"nova-cell1-6631-account-create-update-ndfnr\" (UID: \"d6fc275e-ec41-4c09-b451-67136fa617fd\") " pod="openstack/nova-cell1-6631-account-create-update-ndfnr"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.602061 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89a60a11-c3ae-4f01-8397-dd383dc3fc64-operator-scripts\") pod \"nova-cell0-8a46-account-create-update-nh9zc\" (UID: \"89a60a11-c3ae-4f01-8397-dd383dc3fc64\") " pod="openstack/nova-cell0-8a46-account-create-update-nh9zc"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.602085 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-wgj4h\" (UniqueName: \"kubernetes.io/projected/d6fc275e-ec41-4c09-b451-67136fa617fd-kube-api-access-wgj4h\") pod \"nova-cell1-6631-account-create-update-ndfnr\" (UID: \"d6fc275e-ec41-4c09-b451-67136fa617fd\") " pod="openstack/nova-cell1-6631-account-create-update-ndfnr"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.608858 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89a60a11-c3ae-4f01-8397-dd383dc3fc64-operator-scripts\") pod \"nova-cell0-8a46-account-create-update-nh9zc\" (UID: \"89a60a11-c3ae-4f01-8397-dd383dc3fc64\") " pod="openstack/nova-cell0-8a46-account-create-update-nh9zc"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.613027 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5b39-account-create-update-b7fwt"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.645044 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j76qh\" (UniqueName: \"kubernetes.io/projected/89a60a11-c3ae-4f01-8397-dd383dc3fc64-kube-api-access-j76qh\") pod \"nova-cell0-8a46-account-create-update-nh9zc\" (UID: \"89a60a11-c3ae-4f01-8397-dd383dc3fc64\") " pod="openstack/nova-cell0-8a46-account-create-update-nh9zc"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.664448 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="0008875a2f7ef6e2119165dc1e0e253e98f01735aec210fb18c6ffa1eebbb281" exitCode=0
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.665194 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"0008875a2f7ef6e2119165dc1e0e253e98f01735aec210fb18c6ffa1eebbb281"}
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.665269 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"b0b67d8190c897455e564af68d56eb7f7f1eabacada737f44e7b09e47464a936"}
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.665293 4580 scope.go:117] "RemoveContainer" containerID="3ce83f011c377b22dc6fc9c4fe068d2bf2cb580d09b97baaf4fd92fe417cd5eb"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.684046 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wtwjf"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.707795 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6fc275e-ec41-4c09-b451-67136fa617fd-operator-scripts\") pod \"nova-cell1-6631-account-create-update-ndfnr\" (UID: \"d6fc275e-ec41-4c09-b451-67136fa617fd\") " pod="openstack/nova-cell1-6631-account-create-update-ndfnr"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.707839 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgj4h\" (UniqueName: \"kubernetes.io/projected/d6fc275e-ec41-4c09-b451-67136fa617fd-kube-api-access-wgj4h\") pod \"nova-cell1-6631-account-create-update-ndfnr\" (UID: \"d6fc275e-ec41-4c09-b451-67136fa617fd\") " pod="openstack/nova-cell1-6631-account-create-update-ndfnr"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.714355 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6fc275e-ec41-4c09-b451-67136fa617fd-operator-scripts\") pod \"nova-cell1-6631-account-create-update-ndfnr\" (UID: \"d6fc275e-ec41-4c09-b451-67136fa617fd\") " pod="openstack/nova-cell1-6631-account-create-update-ndfnr"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.741159 4580 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"kube-api-access-wgj4h\" (UniqueName: \"kubernetes.io/projected/d6fc275e-ec41-4c09-b451-67136fa617fd-kube-api-access-wgj4h\") pod \"nova-cell1-6631-account-create-update-ndfnr\" (UID: \"d6fc275e-ec41-4c09-b451-67136fa617fd\") " pod="openstack/nova-cell1-6631-account-create-update-ndfnr"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.749123 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8a46-account-create-update-nh9zc"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.935823 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6631-account-create-update-ndfnr"
Mar 21 05:14:46 crc kubenswrapper[4580]: I0321 05:14:46.949591 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mtd7h"]
Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.034747 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-t6f99"]
Mar 21 05:14:47 crc kubenswrapper[4580]: W0321 05:14:47.078338 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bd7c6be_cb25_4e54_9d83_c40520b86d3e.slice/crio-45c4cc589f0003f07d48b61be8625d3e55c47479d991085a5769b94abed236ce WatchSource:0}: Error finding container 45c4cc589f0003f07d48b61be8625d3e55c47479d991085a5769b94abed236ce: Status 404 returned error can't find the container with id 45c4cc589f0003f07d48b61be8625d3e55c47479d991085a5769b94abed236ce
Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.243411 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5b39-account-create-update-b7fwt"]
Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.515528 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wtwjf"]
Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.678434 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5b39-account-create-update-b7fwt" event={"ID":"9923e811-8726-478a-8357-55d473752028","Type":"ContainerStarted","Data":"1145ff06f67b0de3fa2e9a762e4cd67564c504f94ad4bac001be275aa6d4bc17"}
Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.678483 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5b39-account-create-update-b7fwt" event={"ID":"9923e811-8726-478a-8357-55d473752028","Type":"ContainerStarted","Data":"6eeff7b355df6eb0570dd9c6e0fb07752d89782fa09e09b4d7502de69a547732"}
Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.681078 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wtwjf" event={"ID":"45a8958a-c142-475e-b5c6-3d405ad76dda","Type":"ContainerStarted","Data":"26353f7bb2286d11f0bd44c232658c9ddb01058b3be449642f0ecae8419bf0de"}
Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.682161 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mtd7h" event={"ID":"387f8115-6262-4bbf-9277-1452a5b29e47","Type":"ContainerStarted","Data":"8b0e0e7ba8683b5fd23ac9c1cfeca30cf4da6cd0f9ef8dc566c0975e27087bb4"}
Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.682186 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mtd7h" event={"ID":"387f8115-6262-4bbf-9277-1452a5b29e47","Type":"ContainerStarted","Data":"9541e9998c3c134a0cd47e2c2dc0ac66e0e30a17da3b29ec0146e938fef47f06"}
Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.687827 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t6f99" event={"ID":"0bd7c6be-cb25-4e54-9d83-c40520b86d3e","Type":"ContainerStarted","Data":"797d5aec4bba2ca59b1854fbcc9dcb2f7bccaa718ac9f7b5b34226adf9a55fcf"}
Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.687864 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t6f99"
event={"ID":"0bd7c6be-cb25-4e54-9d83-c40520b86d3e","Type":"ContainerStarted","Data":"45c4cc589f0003f07d48b61be8625d3e55c47479d991085a5769b94abed236ce"} Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.708409 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-mtd7h" podStartSLOduration=1.708392068 podStartE2EDuration="1.708392068s" podCreationTimestamp="2026-03-21 05:14:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:14:47.706504738 +0000 UTC m=+1392.789088386" watchObservedRunningTime="2026-03-21 05:14:47.708392068 +0000 UTC m=+1392.790975696" Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.722135 4580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.722166 4580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.742465 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-t6f99" podStartSLOduration=2.742447308 podStartE2EDuration="2.742447308s" podCreationTimestamp="2026-03-21 05:14:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:14:47.739206412 +0000 UTC m=+1392.821790040" watchObservedRunningTime="2026-03-21 05:14:47.742447308 +0000 UTC m=+1392.825030936" Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.778032 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6631-account-create-update-ndfnr"] Mar 21 05:14:47 crc kubenswrapper[4580]: I0321 05:14:47.801377 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8a46-account-create-update-nh9zc"] Mar 21 05:14:48 crc kubenswrapper[4580]: I0321 05:14:48.734500 4580 
generic.go:334] "Generic (PLEG): container finished" podID="9923e811-8726-478a-8357-55d473752028" containerID="1145ff06f67b0de3fa2e9a762e4cd67564c504f94ad4bac001be275aa6d4bc17" exitCode=0 Mar 21 05:14:48 crc kubenswrapper[4580]: I0321 05:14:48.735476 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5b39-account-create-update-b7fwt" event={"ID":"9923e811-8726-478a-8357-55d473752028","Type":"ContainerDied","Data":"1145ff06f67b0de3fa2e9a762e4cd67564c504f94ad4bac001be275aa6d4bc17"} Mar 21 05:14:48 crc kubenswrapper[4580]: I0321 05:14:48.738729 4580 generic.go:334] "Generic (PLEG): container finished" podID="d6fc275e-ec41-4c09-b451-67136fa617fd" containerID="efb914583c5e717216d083dc481c5adcb0509376eed0a20386e169e9acdebfc7" exitCode=0 Mar 21 05:14:48 crc kubenswrapper[4580]: I0321 05:14:48.738871 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6631-account-create-update-ndfnr" event={"ID":"d6fc275e-ec41-4c09-b451-67136fa617fd","Type":"ContainerDied","Data":"efb914583c5e717216d083dc481c5adcb0509376eed0a20386e169e9acdebfc7"} Mar 21 05:14:48 crc kubenswrapper[4580]: I0321 05:14:48.738899 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6631-account-create-update-ndfnr" event={"ID":"d6fc275e-ec41-4c09-b451-67136fa617fd","Type":"ContainerStarted","Data":"209ca15fb7d4ff92a17fb7cac754f27f1cb62cd02d6a982cc1639333b2dd8463"} Mar 21 05:14:48 crc kubenswrapper[4580]: I0321 05:14:48.744098 4580 generic.go:334] "Generic (PLEG): container finished" podID="45a8958a-c142-475e-b5c6-3d405ad76dda" containerID="bc8c6a6099ae8c2062e398024d430a92d1ff54bb196cf5d1d6ad9b967e3550a4" exitCode=0 Mar 21 05:14:48 crc kubenswrapper[4580]: I0321 05:14:48.744268 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wtwjf" event={"ID":"45a8958a-c142-475e-b5c6-3d405ad76dda","Type":"ContainerDied","Data":"bc8c6a6099ae8c2062e398024d430a92d1ff54bb196cf5d1d6ad9b967e3550a4"} Mar 21 
05:14:48 crc kubenswrapper[4580]: I0321 05:14:48.746927 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8a46-account-create-update-nh9zc" event={"ID":"89a60a11-c3ae-4f01-8397-dd383dc3fc64","Type":"ContainerStarted","Data":"12fff88dca4bf339f4df2f7748bd281eb6b5ec069d6370fa0c5d87f4a41ea20b"} Mar 21 05:14:48 crc kubenswrapper[4580]: I0321 05:14:48.747084 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8a46-account-create-update-nh9zc" event={"ID":"89a60a11-c3ae-4f01-8397-dd383dc3fc64","Type":"ContainerStarted","Data":"177a9b15413baa66f527625ce82d4cf4fe220525d92309b6831638f19f403887"} Mar 21 05:14:48 crc kubenswrapper[4580]: I0321 05:14:48.751525 4580 generic.go:334] "Generic (PLEG): container finished" podID="387f8115-6262-4bbf-9277-1452a5b29e47" containerID="8b0e0e7ba8683b5fd23ac9c1cfeca30cf4da6cd0f9ef8dc566c0975e27087bb4" exitCode=0 Mar 21 05:14:48 crc kubenswrapper[4580]: I0321 05:14:48.751595 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mtd7h" event={"ID":"387f8115-6262-4bbf-9277-1452a5b29e47","Type":"ContainerDied","Data":"8b0e0e7ba8683b5fd23ac9c1cfeca30cf4da6cd0f9ef8dc566c0975e27087bb4"} Mar 21 05:14:48 crc kubenswrapper[4580]: I0321 05:14:48.753029 4580 generic.go:334] "Generic (PLEG): container finished" podID="0bd7c6be-cb25-4e54-9d83-c40520b86d3e" containerID="797d5aec4bba2ca59b1854fbcc9dcb2f7bccaa718ac9f7b5b34226adf9a55fcf" exitCode=0 Mar 21 05:14:48 crc kubenswrapper[4580]: I0321 05:14:48.753063 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t6f99" event={"ID":"0bd7c6be-cb25-4e54-9d83-c40520b86d3e","Type":"ContainerDied","Data":"797d5aec4bba2ca59b1854fbcc9dcb2f7bccaa718ac9f7b5b34226adf9a55fcf"} Mar 21 05:14:49 crc kubenswrapper[4580]: I0321 05:14:49.244711 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 21 05:14:49 crc 
kubenswrapper[4580]: I0321 05:14:49.245049 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 21 05:14:49 crc kubenswrapper[4580]: I0321 05:14:49.295770 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 21 05:14:49 crc kubenswrapper[4580]: I0321 05:14:49.309483 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 21 05:14:49 crc kubenswrapper[4580]: I0321 05:14:49.764142 4580 generic.go:334] "Generic (PLEG): container finished" podID="89a60a11-c3ae-4f01-8397-dd383dc3fc64" containerID="12fff88dca4bf339f4df2f7748bd281eb6b5ec069d6370fa0c5d87f4a41ea20b" exitCode=0 Mar 21 05:14:49 crc kubenswrapper[4580]: I0321 05:14:49.765270 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8a46-account-create-update-nh9zc" event={"ID":"89a60a11-c3ae-4f01-8397-dd383dc3fc64","Type":"ContainerDied","Data":"12fff88dca4bf339f4df2f7748bd281eb6b5ec069d6370fa0c5d87f4a41ea20b"} Mar 21 05:14:49 crc kubenswrapper[4580]: I0321 05:14:49.765301 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 21 05:14:49 crc kubenswrapper[4580]: I0321 05:14:49.765708 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.322664 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wtwjf" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.410934 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b65sw\" (UniqueName: \"kubernetes.io/projected/45a8958a-c142-475e-b5c6-3d405ad76dda-kube-api-access-b65sw\") pod \"45a8958a-c142-475e-b5c6-3d405ad76dda\" (UID: \"45a8958a-c142-475e-b5c6-3d405ad76dda\") " Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.411040 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a8958a-c142-475e-b5c6-3d405ad76dda-operator-scripts\") pod \"45a8958a-c142-475e-b5c6-3d405ad76dda\" (UID: \"45a8958a-c142-475e-b5c6-3d405ad76dda\") " Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.412015 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a8958a-c142-475e-b5c6-3d405ad76dda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45a8958a-c142-475e-b5c6-3d405ad76dda" (UID: "45a8958a-c142-475e-b5c6-3d405ad76dda"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.443172 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45a8958a-c142-475e-b5c6-3d405ad76dda-kube-api-access-b65sw" (OuterVolumeSpecName: "kube-api-access-b65sw") pod "45a8958a-c142-475e-b5c6-3d405ad76dda" (UID: "45a8958a-c142-475e-b5c6-3d405ad76dda"). InnerVolumeSpecName "kube-api-access-b65sw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.526004 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b65sw\" (UniqueName: \"kubernetes.io/projected/45a8958a-c142-475e-b5c6-3d405ad76dda-kube-api-access-b65sw\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.526034 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a8958a-c142-475e-b5c6-3d405ad76dda-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.609660 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5b39-account-create-update-b7fwt" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.611485 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8a46-account-create-update-nh9zc" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.625536 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mtd7h" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.656684 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t6f99" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.668218 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6631-account-create-update-ndfnr" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.731360 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87mv5\" (UniqueName: \"kubernetes.io/projected/387f8115-6262-4bbf-9277-1452a5b29e47-kube-api-access-87mv5\") pod \"387f8115-6262-4bbf-9277-1452a5b29e47\" (UID: \"387f8115-6262-4bbf-9277-1452a5b29e47\") " Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.731425 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89a60a11-c3ae-4f01-8397-dd383dc3fc64-operator-scripts\") pod \"89a60a11-c3ae-4f01-8397-dd383dc3fc64\" (UID: \"89a60a11-c3ae-4f01-8397-dd383dc3fc64\") " Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.731460 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/387f8115-6262-4bbf-9277-1452a5b29e47-operator-scripts\") pod \"387f8115-6262-4bbf-9277-1452a5b29e47\" (UID: \"387f8115-6262-4bbf-9277-1452a5b29e47\") " Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.731529 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6fc275e-ec41-4c09-b451-67136fa617fd-operator-scripts\") pod \"d6fc275e-ec41-4c09-b451-67136fa617fd\" (UID: \"d6fc275e-ec41-4c09-b451-67136fa617fd\") " Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.731584 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9923e811-8726-478a-8357-55d473752028-operator-scripts\") pod \"9923e811-8726-478a-8357-55d473752028\" (UID: \"9923e811-8726-478a-8357-55d473752028\") " Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.737705 4580 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a60a11-c3ae-4f01-8397-dd383dc3fc64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "89a60a11-c3ae-4f01-8397-dd383dc3fc64" (UID: "89a60a11-c3ae-4f01-8397-dd383dc3fc64"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.738490 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/387f8115-6262-4bbf-9277-1452a5b29e47-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "387f8115-6262-4bbf-9277-1452a5b29e47" (UID: "387f8115-6262-4bbf-9277-1452a5b29e47"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.739671 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6fc275e-ec41-4c09-b451-67136fa617fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6fc275e-ec41-4c09-b451-67136fa617fd" (UID: "d6fc275e-ec41-4c09-b451-67136fa617fd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.740576 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzjn5\" (UniqueName: \"kubernetes.io/projected/0bd7c6be-cb25-4e54-9d83-c40520b86d3e-kube-api-access-kzjn5\") pod \"0bd7c6be-cb25-4e54-9d83-c40520b86d3e\" (UID: \"0bd7c6be-cb25-4e54-9d83-c40520b86d3e\") " Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.740752 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgj4h\" (UniqueName: \"kubernetes.io/projected/d6fc275e-ec41-4c09-b451-67136fa617fd-kube-api-access-wgj4h\") pod \"d6fc275e-ec41-4c09-b451-67136fa617fd\" (UID: \"d6fc275e-ec41-4c09-b451-67136fa617fd\") " Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.740811 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng4xg\" (UniqueName: \"kubernetes.io/projected/9923e811-8726-478a-8357-55d473752028-kube-api-access-ng4xg\") pod \"9923e811-8726-478a-8357-55d473752028\" (UID: \"9923e811-8726-478a-8357-55d473752028\") " Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.740850 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bd7c6be-cb25-4e54-9d83-c40520b86d3e-operator-scripts\") pod \"0bd7c6be-cb25-4e54-9d83-c40520b86d3e\" (UID: \"0bd7c6be-cb25-4e54-9d83-c40520b86d3e\") " Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.740906 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j76qh\" (UniqueName: \"kubernetes.io/projected/89a60a11-c3ae-4f01-8397-dd383dc3fc64-kube-api-access-j76qh\") pod \"89a60a11-c3ae-4f01-8397-dd383dc3fc64\" (UID: \"89a60a11-c3ae-4f01-8397-dd383dc3fc64\") " Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.744891 4580 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9923e811-8726-478a-8357-55d473752028-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9923e811-8726-478a-8357-55d473752028" (UID: "9923e811-8726-478a-8357-55d473752028"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.745773 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd7c6be-cb25-4e54-9d83-c40520b86d3e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0bd7c6be-cb25-4e54-9d83-c40520b86d3e" (UID: "0bd7c6be-cb25-4e54-9d83-c40520b86d3e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.752388 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89a60a11-c3ae-4f01-8397-dd383dc3fc64-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.752439 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/387f8115-6262-4bbf-9277-1452a5b29e47-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.752456 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6fc275e-ec41-4c09-b451-67136fa617fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.758490 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/387f8115-6262-4bbf-9277-1452a5b29e47-kube-api-access-87mv5" (OuterVolumeSpecName: "kube-api-access-87mv5") pod "387f8115-6262-4bbf-9277-1452a5b29e47" (UID: "387f8115-6262-4bbf-9277-1452a5b29e47"). 
InnerVolumeSpecName "kube-api-access-87mv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.759579 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bd7c6be-cb25-4e54-9d83-c40520b86d3e-kube-api-access-kzjn5" (OuterVolumeSpecName: "kube-api-access-kzjn5") pod "0bd7c6be-cb25-4e54-9d83-c40520b86d3e" (UID: "0bd7c6be-cb25-4e54-9d83-c40520b86d3e"). InnerVolumeSpecName "kube-api-access-kzjn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.760015 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6fc275e-ec41-4c09-b451-67136fa617fd-kube-api-access-wgj4h" (OuterVolumeSpecName: "kube-api-access-wgj4h") pod "d6fc275e-ec41-4c09-b451-67136fa617fd" (UID: "d6fc275e-ec41-4c09-b451-67136fa617fd"). InnerVolumeSpecName "kube-api-access-wgj4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.764897 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89a60a11-c3ae-4f01-8397-dd383dc3fc64-kube-api-access-j76qh" (OuterVolumeSpecName: "kube-api-access-j76qh") pod "89a60a11-c3ae-4f01-8397-dd383dc3fc64" (UID: "89a60a11-c3ae-4f01-8397-dd383dc3fc64"). InnerVolumeSpecName "kube-api-access-j76qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.805375 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9923e811-8726-478a-8357-55d473752028-kube-api-access-ng4xg" (OuterVolumeSpecName: "kube-api-access-ng4xg") pod "9923e811-8726-478a-8357-55d473752028" (UID: "9923e811-8726-478a-8357-55d473752028"). InnerVolumeSpecName "kube-api-access-ng4xg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.821673 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wtwjf" event={"ID":"45a8958a-c142-475e-b5c6-3d405ad76dda","Type":"ContainerDied","Data":"26353f7bb2286d11f0bd44c232658c9ddb01058b3be449642f0ecae8419bf0de"} Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.821713 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26353f7bb2286d11f0bd44c232658c9ddb01058b3be449642f0ecae8419bf0de" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.821821 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wtwjf" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.830285 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8a46-account-create-update-nh9zc" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.830725 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8a46-account-create-update-nh9zc" event={"ID":"89a60a11-c3ae-4f01-8397-dd383dc3fc64","Type":"ContainerDied","Data":"177a9b15413baa66f527625ce82d4cf4fe220525d92309b6831638f19f403887"} Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.830802 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="177a9b15413baa66f527625ce82d4cf4fe220525d92309b6831638f19f403887" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.833396 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mtd7h" event={"ID":"387f8115-6262-4bbf-9277-1452a5b29e47","Type":"ContainerDied","Data":"9541e9998c3c134a0cd47e2c2dc0ac66e0e30a17da3b29ec0146e938fef47f06"} Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.833430 4580 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="9541e9998c3c134a0cd47e2c2dc0ac66e0e30a17da3b29ec0146e938fef47f06" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.833498 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mtd7h" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.855211 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t6f99" event={"ID":"0bd7c6be-cb25-4e54-9d83-c40520b86d3e","Type":"ContainerDied","Data":"45c4cc589f0003f07d48b61be8625d3e55c47479d991085a5769b94abed236ce"} Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.855255 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45c4cc589f0003f07d48b61be8625d3e55c47479d991085a5769b94abed236ce" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.855224 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t6f99" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.856255 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9923e811-8726-478a-8357-55d473752028-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.856290 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzjn5\" (UniqueName: \"kubernetes.io/projected/0bd7c6be-cb25-4e54-9d83-c40520b86d3e-kube-api-access-kzjn5\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.856304 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng4xg\" (UniqueName: \"kubernetes.io/projected/9923e811-8726-478a-8357-55d473752028-kube-api-access-ng4xg\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.856316 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgj4h\" (UniqueName: 
\"kubernetes.io/projected/d6fc275e-ec41-4c09-b451-67136fa617fd-kube-api-access-wgj4h\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.856328 4580 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bd7c6be-cb25-4e54-9d83-c40520b86d3e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.856338 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j76qh\" (UniqueName: \"kubernetes.io/projected/89a60a11-c3ae-4f01-8397-dd383dc3fc64-kube-api-access-j76qh\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.856350 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87mv5\" (UniqueName: \"kubernetes.io/projected/387f8115-6262-4bbf-9277-1452a5b29e47-kube-api-access-87mv5\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.871145 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5b39-account-create-update-b7fwt" event={"ID":"9923e811-8726-478a-8357-55d473752028","Type":"ContainerDied","Data":"6eeff7b355df6eb0570dd9c6e0fb07752d89782fa09e09b4d7502de69a547732"} Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.871191 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eeff7b355df6eb0570dd9c6e0fb07752d89782fa09e09b4d7502de69a547732" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.871290 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5b39-account-create-update-b7fwt" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.883683 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6631-account-create-update-ndfnr" event={"ID":"d6fc275e-ec41-4c09-b451-67136fa617fd","Type":"ContainerDied","Data":"209ca15fb7d4ff92a17fb7cac754f27f1cb62cd02d6a982cc1639333b2dd8463"} Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.883731 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="209ca15fb7d4ff92a17fb7cac754f27f1cb62cd02d6a982cc1639333b2dd8463" Mar 21 05:14:50 crc kubenswrapper[4580]: I0321 05:14:50.883987 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6631-account-create-update-ndfnr" Mar 21 05:14:51 crc kubenswrapper[4580]: I0321 05:14:51.394218 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 05:14:51 crc kubenswrapper[4580]: I0321 05:14:51.394557 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:14:51 crc kubenswrapper[4580]: I0321 05:14:51.395328 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"f8ab4ef90bd31d20c6033eb943d9a0a9a88a0d10339df1ff4a08e1b1232fe783"} pod="openstack/horizon-587cfc8688-265kc" containerMessage="Container horizon failed startup probe, will be restarted" Mar 21 05:14:51 crc kubenswrapper[4580]: I0321 05:14:51.395369 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-587cfc8688-265kc" 
podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" containerID="cri-o://f8ab4ef90bd31d20c6033eb943d9a0a9a88a0d10339df1ff4a08e1b1232fe783" gracePeriod=30 Mar 21 05:14:51 crc kubenswrapper[4580]: I0321 05:14:51.507209 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 21 05:14:51 crc kubenswrapper[4580]: I0321 05:14:51.507280 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:14:51 crc kubenswrapper[4580]: I0321 05:14:51.508050 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"b1910d7dc39d75c560d1ecb55908d0c4f510cbbee17323265da8706ab45dadba"} pod="openstack/horizon-67655f8b6-mbx6n" containerMessage="Container horizon failed startup probe, will be restarted" Mar 21 05:14:51 crc kubenswrapper[4580]: I0321 05:14:51.508108 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" containerID="cri-o://b1910d7dc39d75c560d1ecb55908d0c4f510cbbee17323265da8706ab45dadba" gracePeriod=30 Mar 21 05:14:51 crc kubenswrapper[4580]: I0321 05:14:51.892530 4580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 05:14:51 crc kubenswrapper[4580]: I0321 05:14:51.892553 4580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 05:14:52 crc kubenswrapper[4580]: I0321 05:14:52.734091 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 05:14:53 crc kubenswrapper[4580]: I0321 05:14:53.060255 4580 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 05:14:53 crc kubenswrapper[4580]: I0321 05:14:53.061153 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 05:14:54 crc kubenswrapper[4580]: I0321 05:14:54.735585 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 21 05:14:54 crc kubenswrapper[4580]: I0321 05:14:54.736152 4580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 05:14:54 crc kubenswrapper[4580]: I0321 05:14:54.749010 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.565265 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vsxgp"] Mar 21 05:14:56 crc kubenswrapper[4580]: E0321 05:14:56.565644 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9923e811-8726-478a-8357-55d473752028" containerName="mariadb-account-create-update" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.565657 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9923e811-8726-478a-8357-55d473752028" containerName="mariadb-account-create-update" Mar 21 05:14:56 crc kubenswrapper[4580]: E0321 05:14:56.565667 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a8958a-c142-475e-b5c6-3d405ad76dda" containerName="mariadb-database-create" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.565674 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a8958a-c142-475e-b5c6-3d405ad76dda" containerName="mariadb-database-create" Mar 21 05:14:56 crc kubenswrapper[4580]: E0321 05:14:56.565693 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fc275e-ec41-4c09-b451-67136fa617fd" 
containerName="mariadb-account-create-update" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.565699 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fc275e-ec41-4c09-b451-67136fa617fd" containerName="mariadb-account-create-update" Mar 21 05:14:56 crc kubenswrapper[4580]: E0321 05:14:56.565718 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bd7c6be-cb25-4e54-9d83-c40520b86d3e" containerName="mariadb-database-create" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.565724 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd7c6be-cb25-4e54-9d83-c40520b86d3e" containerName="mariadb-database-create" Mar 21 05:14:56 crc kubenswrapper[4580]: E0321 05:14:56.565743 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a60a11-c3ae-4f01-8397-dd383dc3fc64" containerName="mariadb-account-create-update" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.565749 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a60a11-c3ae-4f01-8397-dd383dc3fc64" containerName="mariadb-account-create-update" Mar 21 05:14:56 crc kubenswrapper[4580]: E0321 05:14:56.565756 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387f8115-6262-4bbf-9277-1452a5b29e47" containerName="mariadb-database-create" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.565761 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="387f8115-6262-4bbf-9277-1452a5b29e47" containerName="mariadb-database-create" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.566033 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a60a11-c3ae-4f01-8397-dd383dc3fc64" containerName="mariadb-account-create-update" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.566047 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bd7c6be-cb25-4e54-9d83-c40520b86d3e" containerName="mariadb-database-create" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.566057 4580 
memory_manager.go:354] "RemoveStaleState removing state" podUID="387f8115-6262-4bbf-9277-1452a5b29e47" containerName="mariadb-database-create" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.566071 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fc275e-ec41-4c09-b451-67136fa617fd" containerName="mariadb-account-create-update" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.566082 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="9923e811-8726-478a-8357-55d473752028" containerName="mariadb-account-create-update" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.566095 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="45a8958a-c142-475e-b5c6-3d405ad76dda" containerName="mariadb-database-create" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.566667 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.570147 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.570697 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gj22l" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.570881 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.580672 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vsxgp"] Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.687313 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-config-data\") pod \"nova-cell0-conductor-db-sync-vsxgp\" (UID: 
\"d2804338-19fa-40f9-9945-23cabe223f46\") " pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.687457 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x5kw\" (UniqueName: \"kubernetes.io/projected/d2804338-19fa-40f9-9945-23cabe223f46-kube-api-access-9x5kw\") pod \"nova-cell0-conductor-db-sync-vsxgp\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") " pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.687506 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vsxgp\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") " pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.687661 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-scripts\") pod \"nova-cell0-conductor-db-sync-vsxgp\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") " pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.789793 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-scripts\") pod \"nova-cell0-conductor-db-sync-vsxgp\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") " pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.789980 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-config-data\") pod 
\"nova-cell0-conductor-db-sync-vsxgp\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") " pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.790124 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x5kw\" (UniqueName: \"kubernetes.io/projected/d2804338-19fa-40f9-9945-23cabe223f46-kube-api-access-9x5kw\") pod \"nova-cell0-conductor-db-sync-vsxgp\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") " pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.790189 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vsxgp\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") " pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.798397 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vsxgp\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") " pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.798996 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-config-data\") pod \"nova-cell0-conductor-db-sync-vsxgp\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") " pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.799293 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-scripts\") pod 
\"nova-cell0-conductor-db-sync-vsxgp\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") " pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.830436 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x5kw\" (UniqueName: \"kubernetes.io/projected/d2804338-19fa-40f9-9945-23cabe223f46-kube-api-access-9x5kw\") pod \"nova-cell0-conductor-db-sync-vsxgp\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") " pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:56 crc kubenswrapper[4580]: I0321 05:14:56.892076 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vsxgp" Mar 21 05:14:57 crc kubenswrapper[4580]: I0321 05:14:57.584190 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vsxgp"] Mar 21 05:14:57 crc kubenswrapper[4580]: I0321 05:14:57.945509 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vsxgp" event={"ID":"d2804338-19fa-40f9-9945-23cabe223f46","Type":"ContainerStarted","Data":"b4b22c6ba1ca3021c6406d6d8137df67b00e20357eb26afb58670fec14d1be42"} Mar 21 05:14:59 crc kubenswrapper[4580]: I0321 05:14:59.979498 4580 generic.go:334] "Generic (PLEG): container finished" podID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerID="6d95b87e34ca007e4598f9fa298e093ee45b4e68ec1b3518928111cc68259c1a" exitCode=137 Mar 21 05:14:59 crc kubenswrapper[4580]: I0321 05:14:59.979865 4580 generic.go:334] "Generic (PLEG): container finished" podID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerID="ec6a90feef6df9210a56ba0234c2766bf2f184c0f2c9fc8765c4f5b30480a682" exitCode=137 Mar 21 05:14:59 crc kubenswrapper[4580]: I0321 05:14:59.979583 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4","Type":"ContainerDied","Data":"6d95b87e34ca007e4598f9fa298e093ee45b4e68ec1b3518928111cc68259c1a"} Mar 21 05:14:59 crc kubenswrapper[4580]: I0321 05:14:59.979910 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4","Type":"ContainerDied","Data":"ec6a90feef6df9210a56ba0234c2766bf2f184c0f2c9fc8765c4f5b30480a682"} Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.161596 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7"] Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.163369 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.170588 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7"] Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.178355 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.178683 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.287742 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa0c7395-a399-416c-aa03-231a27008b0d-secret-volume\") pod \"collect-profiles-29567835-88td7\" (UID: \"fa0c7395-a399-416c-aa03-231a27008b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.287818 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa0c7395-a399-416c-aa03-231a27008b0d-config-volume\") pod \"collect-profiles-29567835-88td7\" (UID: \"fa0c7395-a399-416c-aa03-231a27008b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.287869 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwjnn\" (UniqueName: \"kubernetes.io/projected/fa0c7395-a399-416c-aa03-231a27008b0d-kube-api-access-xwjnn\") pod \"collect-profiles-29567835-88td7\" (UID: \"fa0c7395-a399-416c-aa03-231a27008b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.343846 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.393637 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa0c7395-a399-416c-aa03-231a27008b0d-secret-volume\") pod \"collect-profiles-29567835-88td7\" (UID: \"fa0c7395-a399-416c-aa03-231a27008b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.393713 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa0c7395-a399-416c-aa03-231a27008b0d-config-volume\") pod \"collect-profiles-29567835-88td7\" (UID: \"fa0c7395-a399-416c-aa03-231a27008b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.393761 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwjnn\" (UniqueName: 
\"kubernetes.io/projected/fa0c7395-a399-416c-aa03-231a27008b0d-kube-api-access-xwjnn\") pod \"collect-profiles-29567835-88td7\" (UID: \"fa0c7395-a399-416c-aa03-231a27008b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.396002 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa0c7395-a399-416c-aa03-231a27008b0d-config-volume\") pod \"collect-profiles-29567835-88td7\" (UID: \"fa0c7395-a399-416c-aa03-231a27008b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.438245 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwjnn\" (UniqueName: \"kubernetes.io/projected/fa0c7395-a399-416c-aa03-231a27008b0d-kube-api-access-xwjnn\") pod \"collect-profiles-29567835-88td7\" (UID: \"fa0c7395-a399-416c-aa03-231a27008b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.438626 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa0c7395-a399-416c-aa03-231a27008b0d-secret-volume\") pod \"collect-profiles-29567835-88td7\" (UID: \"fa0c7395-a399-416c-aa03-231a27008b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.498152 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-run-httpd\") pod \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.499022 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-log-httpd\") pod \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.499158 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-config-data\") pod \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.499329 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-combined-ca-bundle\") pod \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.499422 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-sg-core-conf-yaml\") pod \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.499548 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97d42\" (UniqueName: \"kubernetes.io/projected/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-kube-api-access-97d42\") pod \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.499698 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-scripts\") pod \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\" (UID: \"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4\") " Mar 21 05:15:00 crc 
kubenswrapper[4580]: I0321 05:15:00.498967 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" (UID: "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.501208 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" (UID: "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.509850 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-scripts" (OuterVolumeSpecName: "scripts") pod "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" (UID: "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.510593 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.518752 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-kube-api-access-97d42" (OuterVolumeSpecName: "kube-api-access-97d42") pod "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" (UID: "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4"). InnerVolumeSpecName "kube-api-access-97d42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.542679 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" (UID: "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.610935 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.611550 4580 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.611649 4580 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.611818 4580 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.611937 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97d42\" (UniqueName: \"kubernetes.io/projected/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-kube-api-access-97d42\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.653532 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" (UID: "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.713771 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.812423 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-config-data" (OuterVolumeSpecName: "config-data") pod "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" (UID: "7da7ef3b-3524-42a5-b8f4-132b7e6a08e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:00 crc kubenswrapper[4580]: I0321 05:15:00.819019 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.006546 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7da7ef3b-3524-42a5-b8f4-132b7e6a08e4","Type":"ContainerDied","Data":"da60e517c145c3d6544dede0d54495ae89e326f2c0cabb3e9e534a812902a5fe"} Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.006617 4580 scope.go:117] "RemoveContainer" containerID="06d40532a000f5070c7b72a587cc49ac7da9bf2b5aaac16aab263f4f9950bff0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.006668 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.053145 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.061324 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.080025 4580 scope.go:117] "RemoveContainer" containerID="1d12cac99e2250bd20d1b9eda54fccc6481e73ae9673a65c6438d14333d09658" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.087672 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:15:01 crc kubenswrapper[4580]: E0321 05:15:01.088372 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="ceilometer-central-agent" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.088393 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="ceilometer-central-agent" Mar 21 05:15:01 crc kubenswrapper[4580]: E0321 05:15:01.088442 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="sg-core" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.088455 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="sg-core" Mar 21 05:15:01 crc kubenswrapper[4580]: E0321 05:15:01.088471 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="proxy-httpd" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.088479 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="proxy-httpd" Mar 21 05:15:01 crc kubenswrapper[4580]: E0321 05:15:01.088507 4580 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="ceilometer-notification-agent" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.088514 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="ceilometer-notification-agent" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.088694 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="ceilometer-notification-agent" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.088715 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="sg-core" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.088724 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="proxy-httpd" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.088736 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" containerName="ceilometer-central-agent" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.091664 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.098598 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.099057 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.100181 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.119365 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7"] Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.126936 4580 scope.go:117] "RemoveContainer" containerID="6d95b87e34ca007e4598f9fa298e093ee45b4e68ec1b3518928111cc68259c1a" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.143273 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.217169 4580 scope.go:117] "RemoveContainer" containerID="ec6a90feef6df9210a56ba0234c2766bf2f184c0f2c9fc8765c4f5b30480a682" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.228140 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.228198 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-scripts\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" 
Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.228252 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-config-data\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.228288 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7d379b-3471-4612-8b97-f684dbab8e8e-log-httpd\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.228359 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkxrn\" (UniqueName: \"kubernetes.io/projected/dc7d379b-3471-4612-8b97-f684dbab8e8e-kube-api-access-qkxrn\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.228402 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.228424 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.228451 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7d379b-3471-4612-8b97-f684dbab8e8e-run-httpd\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.330304 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-scripts\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.330374 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-config-data\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.330411 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7d379b-3471-4612-8b97-f684dbab8e8e-log-httpd\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.330470 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkxrn\" (UniqueName: \"kubernetes.io/projected/dc7d379b-3471-4612-8b97-f684dbab8e8e-kube-api-access-qkxrn\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.330498 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.330518 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.330543 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7d379b-3471-4612-8b97-f684dbab8e8e-run-httpd\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.330596 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.331471 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7d379b-3471-4612-8b97-f684dbab8e8e-log-httpd\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.331561 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7d379b-3471-4612-8b97-f684dbab8e8e-run-httpd\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.336405 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.337132 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-scripts\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.341209 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.342584 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.345447 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-config-data\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.357858 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkxrn\" (UniqueName: \"kubernetes.io/projected/dc7d379b-3471-4612-8b97-f684dbab8e8e-kube-api-access-qkxrn\") pod \"ceilometer-0\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.420433 4580 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:15:01 crc kubenswrapper[4580]: I0321 05:15:01.643095 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7da7ef3b-3524-42a5-b8f4-132b7e6a08e4" path="/var/lib/kubelet/pods/7da7ef3b-3524-42a5-b8f4-132b7e6a08e4/volumes" Mar 21 05:15:02 crc kubenswrapper[4580]: I0321 05:15:02.008012 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:15:02 crc kubenswrapper[4580]: I0321 05:15:02.065543 4580 generic.go:334] "Generic (PLEG): container finished" podID="fa0c7395-a399-416c-aa03-231a27008b0d" containerID="87faedf35ac6f3b2468e9c18e4131308b29fbe42705baf3e14f1863e821fcd21" exitCode=0 Mar 21 05:15:02 crc kubenswrapper[4580]: I0321 05:15:02.065629 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" event={"ID":"fa0c7395-a399-416c-aa03-231a27008b0d","Type":"ContainerDied","Data":"87faedf35ac6f3b2468e9c18e4131308b29fbe42705baf3e14f1863e821fcd21"} Mar 21 05:15:02 crc kubenswrapper[4580]: I0321 05:15:02.065662 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" event={"ID":"fa0c7395-a399-416c-aa03-231a27008b0d","Type":"ContainerStarted","Data":"f4dd4abaf04e3409d78be164b9226508d8e0c701f464352d9628deda7d984949"} Mar 21 05:15:03 crc kubenswrapper[4580]: I0321 05:15:03.093278 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7d379b-3471-4612-8b97-f684dbab8e8e","Type":"ContainerStarted","Data":"c5c0a8af7dab76dec7e5c81939bb01d08d6a975f0b2d8d62eaf2d92e85b4939b"} Mar 21 05:15:07 crc kubenswrapper[4580]: I0321 05:15:07.174763 4580 scope.go:117] "RemoveContainer" containerID="71f777cfaf78023adc21f19c8de59042d08f29ef677ffe40192603ca913efe27" Mar 21 05:15:08 crc kubenswrapper[4580]: I0321 05:15:08.613646 4580 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" Mar 21 05:15:08 crc kubenswrapper[4580]: I0321 05:15:08.790824 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwjnn\" (UniqueName: \"kubernetes.io/projected/fa0c7395-a399-416c-aa03-231a27008b0d-kube-api-access-xwjnn\") pod \"fa0c7395-a399-416c-aa03-231a27008b0d\" (UID: \"fa0c7395-a399-416c-aa03-231a27008b0d\") " Mar 21 05:15:08 crc kubenswrapper[4580]: I0321 05:15:08.791285 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa0c7395-a399-416c-aa03-231a27008b0d-config-volume\") pod \"fa0c7395-a399-416c-aa03-231a27008b0d\" (UID: \"fa0c7395-a399-416c-aa03-231a27008b0d\") " Mar 21 05:15:08 crc kubenswrapper[4580]: I0321 05:15:08.791381 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa0c7395-a399-416c-aa03-231a27008b0d-secret-volume\") pod \"fa0c7395-a399-416c-aa03-231a27008b0d\" (UID: \"fa0c7395-a399-416c-aa03-231a27008b0d\") " Mar 21 05:15:08 crc kubenswrapper[4580]: I0321 05:15:08.792845 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa0c7395-a399-416c-aa03-231a27008b0d-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa0c7395-a399-416c-aa03-231a27008b0d" (UID: "fa0c7395-a399-416c-aa03-231a27008b0d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:15:08 crc kubenswrapper[4580]: I0321 05:15:08.796195 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0c7395-a399-416c-aa03-231a27008b0d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa0c7395-a399-416c-aa03-231a27008b0d" (UID: "fa0c7395-a399-416c-aa03-231a27008b0d"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:08 crc kubenswrapper[4580]: I0321 05:15:08.797494 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa0c7395-a399-416c-aa03-231a27008b0d-kube-api-access-xwjnn" (OuterVolumeSpecName: "kube-api-access-xwjnn") pod "fa0c7395-a399-416c-aa03-231a27008b0d" (UID: "fa0c7395-a399-416c-aa03-231a27008b0d"). InnerVolumeSpecName "kube-api-access-xwjnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:15:08 crc kubenswrapper[4580]: I0321 05:15:08.893711 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwjnn\" (UniqueName: \"kubernetes.io/projected/fa0c7395-a399-416c-aa03-231a27008b0d-kube-api-access-xwjnn\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:08 crc kubenswrapper[4580]: I0321 05:15:08.893743 4580 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa0c7395-a399-416c-aa03-231a27008b0d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:08 crc kubenswrapper[4580]: I0321 05:15:08.893755 4580 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa0c7395-a399-416c-aa03-231a27008b0d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:09 crc kubenswrapper[4580]: I0321 05:15:09.156976 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7d379b-3471-4612-8b97-f684dbab8e8e","Type":"ContainerStarted","Data":"4ca3246167dcb1c76c504c98bd77be031d6cc2e005a0b8367694aa5c990a190d"} Mar 21 05:15:09 crc kubenswrapper[4580]: I0321 05:15:09.158997 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" Mar 21 05:15:09 crc kubenswrapper[4580]: I0321 05:15:09.159293 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7" event={"ID":"fa0c7395-a399-416c-aa03-231a27008b0d","Type":"ContainerDied","Data":"f4dd4abaf04e3409d78be164b9226508d8e0c701f464352d9628deda7d984949"} Mar 21 05:15:09 crc kubenswrapper[4580]: I0321 05:15:09.159323 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4dd4abaf04e3409d78be164b9226508d8e0c701f464352d9628deda7d984949" Mar 21 05:15:09 crc kubenswrapper[4580]: I0321 05:15:09.161957 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vsxgp" event={"ID":"d2804338-19fa-40f9-9945-23cabe223f46","Type":"ContainerStarted","Data":"e1e71939e2eba028b0abb4f53083b37edb6d7a87fb7124ab1f067b21bbb4c602"} Mar 21 05:15:09 crc kubenswrapper[4580]: I0321 05:15:09.187870 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-vsxgp" podStartSLOduration=2.166040678 podStartE2EDuration="13.18785159s" podCreationTimestamp="2026-03-21 05:14:56 +0000 UTC" firstStartedPulling="2026-03-21 05:14:57.596375789 +0000 UTC m=+1402.678959407" lastFinishedPulling="2026-03-21 05:15:08.618186691 +0000 UTC m=+1413.700770319" observedRunningTime="2026-03-21 05:15:09.186906385 +0000 UTC m=+1414.269490013" watchObservedRunningTime="2026-03-21 05:15:09.18785159 +0000 UTC m=+1414.270435218" Mar 21 05:15:10 crc kubenswrapper[4580]: I0321 05:15:10.172576 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7d379b-3471-4612-8b97-f684dbab8e8e","Type":"ContainerStarted","Data":"be4fa2b8f0d4118ec08cec90330732da336f4d1d2e7858ede3fb865b1a5c6688"} Mar 21 05:15:11 crc kubenswrapper[4580]: I0321 05:15:11.184756 4580 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7d379b-3471-4612-8b97-f684dbab8e8e","Type":"ContainerStarted","Data":"2dd91e2dd9585cb6ba9b33282cb1ecfb8942549a5d89bd25c04a64043772178f"} Mar 21 05:15:13 crc kubenswrapper[4580]: I0321 05:15:13.205610 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7d379b-3471-4612-8b97-f684dbab8e8e","Type":"ContainerStarted","Data":"24196a49e4971d8d2b205ebc5a78d8102c8e522db828595d6203623ddfdf1125"} Mar 21 05:15:13 crc kubenswrapper[4580]: I0321 05:15:13.206457 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 05:15:13 crc kubenswrapper[4580]: I0321 05:15:13.232474 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.956768926 podStartE2EDuration="12.232458864s" podCreationTimestamp="2026-03-21 05:15:01 +0000 UTC" firstStartedPulling="2026-03-21 05:15:02.022074934 +0000 UTC m=+1407.104658562" lastFinishedPulling="2026-03-21 05:15:12.297764872 +0000 UTC m=+1417.380348500" observedRunningTime="2026-03-21 05:15:13.23193758 +0000 UTC m=+1418.314521238" watchObservedRunningTime="2026-03-21 05:15:13.232458864 +0000 UTC m=+1418.315042492" Mar 21 05:15:15 crc kubenswrapper[4580]: I0321 05:15:15.388597 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:15:15 crc kubenswrapper[4580]: I0321 05:15:15.389297 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="ceilometer-central-agent" containerID="cri-o://4ca3246167dcb1c76c504c98bd77be031d6cc2e005a0b8367694aa5c990a190d" gracePeriod=30 Mar 21 05:15:15 crc kubenswrapper[4580]: I0321 05:15:15.389485 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="proxy-httpd" containerID="cri-o://24196a49e4971d8d2b205ebc5a78d8102c8e522db828595d6203623ddfdf1125" gracePeriod=30 Mar 21 05:15:15 crc kubenswrapper[4580]: I0321 05:15:15.389511 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="ceilometer-notification-agent" containerID="cri-o://be4fa2b8f0d4118ec08cec90330732da336f4d1d2e7858ede3fb865b1a5c6688" gracePeriod=30 Mar 21 05:15:15 crc kubenswrapper[4580]: I0321 05:15:15.389622 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="sg-core" containerID="cri-o://2dd91e2dd9585cb6ba9b33282cb1ecfb8942549a5d89bd25c04a64043772178f" gracePeriod=30 Mar 21 05:15:16 crc kubenswrapper[4580]: I0321 05:15:16.232842 4580 generic.go:334] "Generic (PLEG): container finished" podID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerID="24196a49e4971d8d2b205ebc5a78d8102c8e522db828595d6203623ddfdf1125" exitCode=0 Mar 21 05:15:16 crc kubenswrapper[4580]: I0321 05:15:16.232872 4580 generic.go:334] "Generic (PLEG): container finished" podID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerID="2dd91e2dd9585cb6ba9b33282cb1ecfb8942549a5d89bd25c04a64043772178f" exitCode=2 Mar 21 05:15:16 crc kubenswrapper[4580]: I0321 05:15:16.232881 4580 generic.go:334] "Generic (PLEG): container finished" podID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerID="be4fa2b8f0d4118ec08cec90330732da336f4d1d2e7858ede3fb865b1a5c6688" exitCode=0 Mar 21 05:15:16 crc kubenswrapper[4580]: I0321 05:15:16.232897 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7d379b-3471-4612-8b97-f684dbab8e8e","Type":"ContainerDied","Data":"24196a49e4971d8d2b205ebc5a78d8102c8e522db828595d6203623ddfdf1125"} Mar 21 05:15:16 crc kubenswrapper[4580]: I0321 
05:15:16.232937 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7d379b-3471-4612-8b97-f684dbab8e8e","Type":"ContainerDied","Data":"2dd91e2dd9585cb6ba9b33282cb1ecfb8942549a5d89bd25c04a64043772178f"} Mar 21 05:15:16 crc kubenswrapper[4580]: I0321 05:15:16.232949 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7d379b-3471-4612-8b97-f684dbab8e8e","Type":"ContainerDied","Data":"be4fa2b8f0d4118ec08cec90330732da336f4d1d2e7858ede3fb865b1a5c6688"} Mar 21 05:15:22 crc kubenswrapper[4580]: I0321 05:15:22.329315 4580 generic.go:334] "Generic (PLEG): container finished" podID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerID="4ca3246167dcb1c76c504c98bd77be031d6cc2e005a0b8367694aa5c990a190d" exitCode=0 Mar 21 05:15:22 crc kubenswrapper[4580]: I0321 05:15:22.329384 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7d379b-3471-4612-8b97-f684dbab8e8e","Type":"ContainerDied","Data":"4ca3246167dcb1c76c504c98bd77be031d6cc2e005a0b8367694aa5c990a190d"} Mar 21 05:15:22 crc kubenswrapper[4580]: I0321 05:15:22.332390 4580 generic.go:334] "Generic (PLEG): container finished" podID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerID="b1910d7dc39d75c560d1ecb55908d0c4f510cbbee17323265da8706ab45dadba" exitCode=137 Mar 21 05:15:22 crc kubenswrapper[4580]: I0321 05:15:22.332446 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67655f8b6-mbx6n" event={"ID":"a03ce0fa-f7e8-4b48-bbea-95807f14dd26","Type":"ContainerDied","Data":"b1910d7dc39d75c560d1ecb55908d0c4f510cbbee17323265da8706ab45dadba"} Mar 21 05:15:22 crc kubenswrapper[4580]: I0321 05:15:22.332476 4580 scope.go:117] "RemoveContainer" containerID="b03c7b6b3d34260bff0a00bc798a52da6836ea0ee76b7c6df6980b8c29af49eb" Mar 21 05:15:22 crc kubenswrapper[4580]: I0321 05:15:22.337703 4580 generic.go:334] "Generic (PLEG): container finished" 
podID="08a0110f-428a-481d-b439-bc16e6837dc3" containerID="f8ab4ef90bd31d20c6033eb943d9a0a9a88a0d10339df1ff4a08e1b1232fe783" exitCode=137 Mar 21 05:15:22 crc kubenswrapper[4580]: I0321 05:15:22.337740 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587cfc8688-265kc" event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerDied","Data":"f8ab4ef90bd31d20c6033eb943d9a0a9a88a0d10339df1ff4a08e1b1232fe783"} Mar 21 05:15:22 crc kubenswrapper[4580]: I0321 05:15:22.337769 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587cfc8688-265kc" event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerStarted","Data":"aabef473a7fedba8211603363e3d7574d3a24bbc5ab5b0fe74504ddddca72333"} Mar 21 05:15:22 crc kubenswrapper[4580]: I0321 05:15:22.676417 4580 scope.go:117] "RemoveContainer" containerID="19fb47284615f4db4d3ee3b8a1bb2963d50724cdbe63d92f0b19442506b6bf5b" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.053950 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.098119 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7d379b-3471-4612-8b97-f684dbab8e8e-run-httpd\") pod \"dc7d379b-3471-4612-8b97-f684dbab8e8e\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.098164 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-scripts\") pod \"dc7d379b-3471-4612-8b97-f684dbab8e8e\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.098269 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-combined-ca-bundle\") pod \"dc7d379b-3471-4612-8b97-f684dbab8e8e\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.098307 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-sg-core-conf-yaml\") pod \"dc7d379b-3471-4612-8b97-f684dbab8e8e\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.098384 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-config-data\") pod \"dc7d379b-3471-4612-8b97-f684dbab8e8e\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.098411 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-ceilometer-tls-certs\") pod \"dc7d379b-3471-4612-8b97-f684dbab8e8e\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.098435 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7d379b-3471-4612-8b97-f684dbab8e8e-log-httpd\") pod \"dc7d379b-3471-4612-8b97-f684dbab8e8e\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.098473 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkxrn\" (UniqueName: \"kubernetes.io/projected/dc7d379b-3471-4612-8b97-f684dbab8e8e-kube-api-access-qkxrn\") pod \"dc7d379b-3471-4612-8b97-f684dbab8e8e\" (UID: \"dc7d379b-3471-4612-8b97-f684dbab8e8e\") " Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.099799 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc7d379b-3471-4612-8b97-f684dbab8e8e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc7d379b-3471-4612-8b97-f684dbab8e8e" (UID: "dc7d379b-3471-4612-8b97-f684dbab8e8e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.128571 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc7d379b-3471-4612-8b97-f684dbab8e8e-kube-api-access-qkxrn" (OuterVolumeSpecName: "kube-api-access-qkxrn") pod "dc7d379b-3471-4612-8b97-f684dbab8e8e" (UID: "dc7d379b-3471-4612-8b97-f684dbab8e8e"). InnerVolumeSpecName "kube-api-access-qkxrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.130740 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-scripts" (OuterVolumeSpecName: "scripts") pod "dc7d379b-3471-4612-8b97-f684dbab8e8e" (UID: "dc7d379b-3471-4612-8b97-f684dbab8e8e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.130941 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc7d379b-3471-4612-8b97-f684dbab8e8e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc7d379b-3471-4612-8b97-f684dbab8e8e" (UID: "dc7d379b-3471-4612-8b97-f684dbab8e8e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.174636 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "dc7d379b-3471-4612-8b97-f684dbab8e8e" (UID: "dc7d379b-3471-4612-8b97-f684dbab8e8e"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.202686 4580 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7d379b-3471-4612-8b97-f684dbab8e8e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.202721 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.202733 4580 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.202749 4580 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc7d379b-3471-4612-8b97-f684dbab8e8e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.202761 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkxrn\" (UniqueName: \"kubernetes.io/projected/dc7d379b-3471-4612-8b97-f684dbab8e8e-kube-api-access-qkxrn\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.212755 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc7d379b-3471-4612-8b97-f684dbab8e8e" (UID: "dc7d379b-3471-4612-8b97-f684dbab8e8e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.232501 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc7d379b-3471-4612-8b97-f684dbab8e8e" (UID: "dc7d379b-3471-4612-8b97-f684dbab8e8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.259179 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-config-data" (OuterVolumeSpecName: "config-data") pod "dc7d379b-3471-4612-8b97-f684dbab8e8e" (UID: "dc7d379b-3471-4612-8b97-f684dbab8e8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.304703 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.304736 4580 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.304747 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7d379b-3471-4612-8b97-f684dbab8e8e-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.354360 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67655f8b6-mbx6n" 
event={"ID":"a03ce0fa-f7e8-4b48-bbea-95807f14dd26","Type":"ContainerStarted","Data":"f660370a5b85c0757c978411d4b13c5ed188b23f7b881d8e81f31c5eac41a537"} Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.379004 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc7d379b-3471-4612-8b97-f684dbab8e8e","Type":"ContainerDied","Data":"c5c0a8af7dab76dec7e5c81939bb01d08d6a975f0b2d8d62eaf2d92e85b4939b"} Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.379128 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.380062 4580 scope.go:117] "RemoveContainer" containerID="24196a49e4971d8d2b205ebc5a78d8102c8e522db828595d6203623ddfdf1125" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.427339 4580 scope.go:117] "RemoveContainer" containerID="2dd91e2dd9585cb6ba9b33282cb1ecfb8942549a5d89bd25c04a64043772178f" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.433198 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.472603 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.488057 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:15:23 crc kubenswrapper[4580]: E0321 05:15:23.488549 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="proxy-httpd" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.488575 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="proxy-httpd" Mar 21 05:15:23 crc kubenswrapper[4580]: E0321 05:15:23.488601 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" 
containerName="ceilometer-central-agent" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.488610 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="ceilometer-central-agent" Mar 21 05:15:23 crc kubenswrapper[4580]: E0321 05:15:23.488620 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="sg-core" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.488627 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="sg-core" Mar 21 05:15:23 crc kubenswrapper[4580]: E0321 05:15:23.488655 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0c7395-a399-416c-aa03-231a27008b0d" containerName="collect-profiles" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.488662 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0c7395-a399-416c-aa03-231a27008b0d" containerName="collect-profiles" Mar 21 05:15:23 crc kubenswrapper[4580]: E0321 05:15:23.488677 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="ceilometer-notification-agent" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.488710 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="ceilometer-notification-agent" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.488923 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="sg-core" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.488942 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="ceilometer-notification-agent" Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.488961 4580 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fa0c7395-a399-416c-aa03-231a27008b0d" containerName="collect-profiles"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.488972 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="ceilometer-central-agent"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.488982 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" containerName="proxy-httpd"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.490814 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.492852 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.493115 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.493271 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.503673 4580 scope.go:117] "RemoveContainer" containerID="be4fa2b8f0d4118ec08cec90330732da336f4d1d2e7858ede3fb865b1a5c6688"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.530026 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-config-data\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.530086 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2gt8\" (UniqueName: \"kubernetes.io/projected/e3333784-13bd-4bc3-b52b-97899001daaf-kube-api-access-m2gt8\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.530184 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3333784-13bd-4bc3-b52b-97899001daaf-run-httpd\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.530251 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-scripts\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.530269 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3333784-13bd-4bc3-b52b-97899001daaf-log-httpd\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.530282 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.530298 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.530318 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.530889 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.555892 4580 scope.go:117] "RemoveContainer" containerID="4ca3246167dcb1c76c504c98bd77be031d6cc2e005a0b8367694aa5c990a190d"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.631727 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2gt8\" (UniqueName: \"kubernetes.io/projected/e3333784-13bd-4bc3-b52b-97899001daaf-kube-api-access-m2gt8\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.631867 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3333784-13bd-4bc3-b52b-97899001daaf-run-httpd\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.631927 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-scripts\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.631949 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3333784-13bd-4bc3-b52b-97899001daaf-log-httpd\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.631966 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.631986 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.632008 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.632099 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-config-data\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.633447 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3333784-13bd-4bc3-b52b-97899001daaf-log-httpd\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.633615 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3333784-13bd-4bc3-b52b-97899001daaf-run-httpd\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.639635 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-scripts\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.639902 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.640094 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-config-data\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.640502 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.642083 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc7d379b-3471-4612-8b97-f684dbab8e8e" path="/var/lib/kubelet/pods/dc7d379b-3471-4612-8b97-f684dbab8e8e/volumes"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.646585 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.655358 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2gt8\" (UniqueName: \"kubernetes.io/projected/e3333784-13bd-4bc3-b52b-97899001daaf-kube-api-access-m2gt8\") pod \"ceilometer-0\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " pod="openstack/ceilometer-0"
Mar 21 05:15:23 crc kubenswrapper[4580]: I0321 05:15:23.844932 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 05:15:24 crc kubenswrapper[4580]: I0321 05:15:24.351535 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 05:15:24 crc kubenswrapper[4580]: W0321 05:15:24.353930 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3333784_13bd_4bc3_b52b_97899001daaf.slice/crio-ff23de3c89053ee9ed2548541eef5a62f4ff6d1cb1e54dbd1749c4af8362766f WatchSource:0}: Error finding container ff23de3c89053ee9ed2548541eef5a62f4ff6d1cb1e54dbd1749c4af8362766f: Status 404 returned error can't find the container with id ff23de3c89053ee9ed2548541eef5a62f4ff6d1cb1e54dbd1749c4af8362766f
Mar 21 05:15:24 crc kubenswrapper[4580]: I0321 05:15:24.389550 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3333784-13bd-4bc3-b52b-97899001daaf","Type":"ContainerStarted","Data":"ff23de3c89053ee9ed2548541eef5a62f4ff6d1cb1e54dbd1749c4af8362766f"}
Mar 21 05:15:25 crc kubenswrapper[4580]: I0321 05:15:25.401148 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3333784-13bd-4bc3-b52b-97899001daaf","Type":"ContainerStarted","Data":"c8682db5f1111effe548a1609bfc5a39ca2f9fac8b81dab5af5edffa38454eef"}
Mar 21 05:15:26 crc kubenswrapper[4580]: I0321 05:15:26.413214 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3333784-13bd-4bc3-b52b-97899001daaf","Type":"ContainerStarted","Data":"caa04f472ba51c65977ef6b911f99c99eaa39ad90fdc91393dec2d6cf5a43641"}
Mar 21 05:15:28 crc kubenswrapper[4580]: I0321 05:15:28.436812 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3333784-13bd-4bc3-b52b-97899001daaf","Type":"ContainerStarted","Data":"e1f39073542de59327a521c9b5471a16d8975c3a39f2d713395243dfca36553f"}
Mar 21 05:15:29 crc kubenswrapper[4580]: I0321 05:15:29.447949 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3333784-13bd-4bc3-b52b-97899001daaf","Type":"ContainerStarted","Data":"5b0caee2f00fca700e7409e04cd8b8edfc633add411f4111e7ae6af73bbf1007"}
Mar 21 05:15:29 crc kubenswrapper[4580]: I0321 05:15:29.448350 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 21 05:15:29 crc kubenswrapper[4580]: I0321 05:15:29.478859 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.778048858 podStartE2EDuration="6.478836841s" podCreationTimestamp="2026-03-21 05:15:23 +0000 UTC" firstStartedPulling="2026-03-21 05:15:24.356525147 +0000 UTC m=+1429.439108765" lastFinishedPulling="2026-03-21 05:15:29.05731312 +0000 UTC m=+1434.139896748" observedRunningTime="2026-03-21 05:15:29.472349568 +0000 UTC m=+1434.554933216" watchObservedRunningTime="2026-03-21 05:15:29.478836841 +0000 UTC m=+1434.561420469"
Mar 21 05:15:31 crc kubenswrapper[4580]: I0321 05:15:31.393944 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-587cfc8688-265kc"
Mar 21 05:15:31 crc kubenswrapper[4580]: I0321 05:15:31.394297 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-587cfc8688-265kc"
Mar 21 05:15:31 crc kubenswrapper[4580]: I0321 05:15:31.395375 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Mar 21 05:15:31 crc kubenswrapper[4580]: I0321 05:15:31.506958 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67655f8b6-mbx6n"
Mar 21 05:15:31 crc kubenswrapper[4580]: I0321 05:15:31.508239 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67655f8b6-mbx6n"
Mar 21 05:15:32 crc kubenswrapper[4580]: I0321 05:15:32.484143 4580 generic.go:334] "Generic (PLEG): container finished" podID="d2804338-19fa-40f9-9945-23cabe223f46" containerID="e1e71939e2eba028b0abb4f53083b37edb6d7a87fb7124ab1f067b21bbb4c602" exitCode=0
Mar 21 05:15:32 crc kubenswrapper[4580]: I0321 05:15:32.484223 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vsxgp" event={"ID":"d2804338-19fa-40f9-9945-23cabe223f46","Type":"ContainerDied","Data":"e1e71939e2eba028b0abb4f53083b37edb6d7a87fb7124ab1f067b21bbb4c602"}
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.015170 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vsxgp"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.159629 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-config-data\") pod \"d2804338-19fa-40f9-9945-23cabe223f46\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") "
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.160053 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x5kw\" (UniqueName: \"kubernetes.io/projected/d2804338-19fa-40f9-9945-23cabe223f46-kube-api-access-9x5kw\") pod \"d2804338-19fa-40f9-9945-23cabe223f46\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") "
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.160170 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-combined-ca-bundle\") pod \"d2804338-19fa-40f9-9945-23cabe223f46\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") "
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.160201 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-scripts\") pod \"d2804338-19fa-40f9-9945-23cabe223f46\" (UID: \"d2804338-19fa-40f9-9945-23cabe223f46\") "
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.167108 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2804338-19fa-40f9-9945-23cabe223f46-kube-api-access-9x5kw" (OuterVolumeSpecName: "kube-api-access-9x5kw") pod "d2804338-19fa-40f9-9945-23cabe223f46" (UID: "d2804338-19fa-40f9-9945-23cabe223f46"). InnerVolumeSpecName "kube-api-access-9x5kw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.186471 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-scripts" (OuterVolumeSpecName: "scripts") pod "d2804338-19fa-40f9-9945-23cabe223f46" (UID: "d2804338-19fa-40f9-9945-23cabe223f46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.200900 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2804338-19fa-40f9-9945-23cabe223f46" (UID: "d2804338-19fa-40f9-9945-23cabe223f46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.235949 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-config-data" (OuterVolumeSpecName: "config-data") pod "d2804338-19fa-40f9-9945-23cabe223f46" (UID: "d2804338-19fa-40f9-9945-23cabe223f46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.263157 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.263196 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x5kw\" (UniqueName: \"kubernetes.io/projected/d2804338-19fa-40f9-9945-23cabe223f46-kube-api-access-9x5kw\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.264181 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.264228 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2804338-19fa-40f9-9945-23cabe223f46-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.507578 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vsxgp" event={"ID":"d2804338-19fa-40f9-9945-23cabe223f46","Type":"ContainerDied","Data":"b4b22c6ba1ca3021c6406d6d8137df67b00e20357eb26afb58670fec14d1be42"}
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.507623 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4b22c6ba1ca3021c6406d6d8137df67b00e20357eb26afb58670fec14d1be42"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.507701 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vsxgp"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.663307 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 21 05:15:34 crc kubenswrapper[4580]: E0321 05:15:34.663734 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2804338-19fa-40f9-9945-23cabe223f46" containerName="nova-cell0-conductor-db-sync"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.663750 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2804338-19fa-40f9-9945-23cabe223f46" containerName="nova-cell0-conductor-db-sync"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.664029 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2804338-19fa-40f9-9945-23cabe223f46" containerName="nova-cell0-conductor-db-sync"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.664722 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.670622 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gj22l"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.673763 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc118c0b-9b79-4e70-a775-a437c1b83b2c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"dc118c0b-9b79-4e70-a775-a437c1b83b2c\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.673940 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc118c0b-9b79-4e70-a775-a437c1b83b2c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"dc118c0b-9b79-4e70-a775-a437c1b83b2c\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.673975 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7wbm\" (UniqueName: \"kubernetes.io/projected/dc118c0b-9b79-4e70-a775-a437c1b83b2c-kube-api-access-b7wbm\") pod \"nova-cell0-conductor-0\" (UID: \"dc118c0b-9b79-4e70-a775-a437c1b83b2c\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.676406 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.687752 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.775528 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc118c0b-9b79-4e70-a775-a437c1b83b2c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"dc118c0b-9b79-4e70-a775-a437c1b83b2c\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.775913 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7wbm\" (UniqueName: \"kubernetes.io/projected/dc118c0b-9b79-4e70-a775-a437c1b83b2c-kube-api-access-b7wbm\") pod \"nova-cell0-conductor-0\" (UID: \"dc118c0b-9b79-4e70-a775-a437c1b83b2c\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.776690 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc118c0b-9b79-4e70-a775-a437c1b83b2c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"dc118c0b-9b79-4e70-a775-a437c1b83b2c\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.780297 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc118c0b-9b79-4e70-a775-a437c1b83b2c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"dc118c0b-9b79-4e70-a775-a437c1b83b2c\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.781390 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc118c0b-9b79-4e70-a775-a437c1b83b2c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"dc118c0b-9b79-4e70-a775-a437c1b83b2c\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.803446 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7wbm\" (UniqueName: \"kubernetes.io/projected/dc118c0b-9b79-4e70-a775-a437c1b83b2c-kube-api-access-b7wbm\") pod \"nova-cell0-conductor-0\" (UID: \"dc118c0b-9b79-4e70-a775-a437c1b83b2c\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 05:15:34 crc kubenswrapper[4580]: I0321 05:15:34.986008 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 21 05:15:35 crc kubenswrapper[4580]: I0321 05:15:35.474496 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 21 05:15:35 crc kubenswrapper[4580]: I0321 05:15:35.517565 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"dc118c0b-9b79-4e70-a775-a437c1b83b2c","Type":"ContainerStarted","Data":"f6aa365c5482daad3072f8195c52d36934906502544f40e1d57cf8c83bb13c60"}
Mar 21 05:15:36 crc kubenswrapper[4580]: I0321 05:15:36.529822 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"dc118c0b-9b79-4e70-a775-a437c1b83b2c","Type":"ContainerStarted","Data":"0a1bc1c46a84698cd600099ca75a8a413b0bd25a6ae40301adfb2899d4942ea3"}
Mar 21 05:15:36 crc kubenswrapper[4580]: I0321 05:15:36.530396 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 21 05:15:36 crc kubenswrapper[4580]: I0321 05:15:36.563477 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.563450869 podStartE2EDuration="2.563450869s" podCreationTimestamp="2026-03-21 05:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:15:36.554005697 +0000 UTC m=+1441.636589335" watchObservedRunningTime="2026-03-21 05:15:36.563450869 +0000 UTC m=+1441.646034507"
Mar 21 05:15:41 crc kubenswrapper[4580]: I0321 05:15:41.393600 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Mar 21 05:15:41 crc kubenswrapper[4580]: I0321 05:15:41.509401 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.018976 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.732977 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-jw68q"]
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.734747 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.740265 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.744130 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.756256 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jw68q"]
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.837708 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-config-data\") pod \"nova-cell0-cell-mapping-jw68q\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.837812 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmvwh\" (UniqueName: \"kubernetes.io/projected/c1da4990-e129-41f6-acca-138ab10c03cc-kube-api-access-cmvwh\") pod \"nova-cell0-cell-mapping-jw68q\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.837846 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-scripts\") pod \"nova-cell0-cell-mapping-jw68q\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.838027 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jw68q\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.940315 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jw68q\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.940642 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-config-data\") pod \"nova-cell0-cell-mapping-jw68q\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.940674 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmvwh\" (UniqueName: \"kubernetes.io/projected/c1da4990-e129-41f6-acca-138ab10c03cc-kube-api-access-cmvwh\") pod \"nova-cell0-cell-mapping-jw68q\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.940691 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-scripts\") pod \"nova-cell0-cell-mapping-jw68q\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.946692 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-scripts\") pod \"nova-cell0-cell-mapping-jw68q\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.947826 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-config-data\") pod \"nova-cell0-cell-mapping-jw68q\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:45 crc kubenswrapper[4580]: I0321 05:15:45.973911 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jw68q\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.038162 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmvwh\" (UniqueName: \"kubernetes.io/projected/c1da4990-e129-41f6-acca-138ab10c03cc-kube-api-access-cmvwh\") pod \"nova-cell0-cell-mapping-jw68q\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.039764 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.041292 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.051232 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.078045 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.082242 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jw68q"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.158930 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ac92af5-c44d-488d-a784-f028b868fc24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2ac92af5-c44d-488d-a784-f028b868fc24\") " pod="openstack/nova-scheduler-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.158988 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v75kw\" (UniqueName: \"kubernetes.io/projected/2ac92af5-c44d-488d-a784-f028b868fc24-kube-api-access-v75kw\") pod \"nova-scheduler-0\" (UID: \"2ac92af5-c44d-488d-a784-f028b868fc24\") " pod="openstack/nova-scheduler-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.159027 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ac92af5-c44d-488d-a784-f028b868fc24-config-data\") pod \"nova-scheduler-0\" (UID: \"2ac92af5-c44d-488d-a784-f028b868fc24\") " pod="openstack/nova-scheduler-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.260854 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ac92af5-c44d-488d-a784-f028b868fc24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2ac92af5-c44d-488d-a784-f028b868fc24\") " pod="openstack/nova-scheduler-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.260897 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v75kw\" (UniqueName: \"kubernetes.io/projected/2ac92af5-c44d-488d-a784-f028b868fc24-kube-api-access-v75kw\") pod \"nova-scheduler-0\" (UID: \"2ac92af5-c44d-488d-a784-f028b868fc24\") " pod="openstack/nova-scheduler-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.260963 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ac92af5-c44d-488d-a784-f028b868fc24-config-data\") pod \"nova-scheduler-0\" (UID: \"2ac92af5-c44d-488d-a784-f028b868fc24\") " pod="openstack/nova-scheduler-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.263515 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.266115 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ac92af5-c44d-488d-a784-f028b868fc24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2ac92af5-c44d-488d-a784-f028b868fc24\") " pod="openstack/nova-scheduler-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.267191 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.273133 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ac92af5-c44d-488d-a784-f028b868fc24-config-data\") pod \"nova-scheduler-0\" (UID: \"2ac92af5-c44d-488d-a784-f028b868fc24\") " pod="openstack/nova-scheduler-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.308163 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.356796 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.363538 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09860a85-c2e1-464a-9c6e-8361a90aa306-logs\") pod \"nova-api-0\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " pod="openstack/nova-api-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.363601 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7dw7\" (UniqueName: \"kubernetes.io/projected/09860a85-c2e1-464a-9c6e-8361a90aa306-kube-api-access-b7dw7\") pod \"nova-api-0\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " pod="openstack/nova-api-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.363626 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09860a85-c2e1-464a-9c6e-8361a90aa306-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " pod="openstack/nova-api-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.363686 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09860a85-c2e1-464a-9c6e-8361a90aa306-config-data\") pod \"nova-api-0\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " pod="openstack/nova-api-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.396336 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v75kw\" (UniqueName: \"kubernetes.io/projected/2ac92af5-c44d-488d-a784-f028b868fc24-kube-api-access-v75kw\") pod \"nova-scheduler-0\" (UID: \"2ac92af5-c44d-488d-a784-f028b868fc24\") " pod="openstack/nova-scheduler-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.422537 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.423903 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.435750 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.436542 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.464941 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09860a85-c2e1-464a-9c6e-8361a90aa306-logs\") pod \"nova-api-0\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " pod="openstack/nova-api-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.465006 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7dw7\" (UniqueName: \"kubernetes.io/projected/09860a85-c2e1-464a-9c6e-8361a90aa306-kube-api-access-b7dw7\") pod \"nova-api-0\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " pod="openstack/nova-api-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.465030 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz7p8\" (UniqueName: \"kubernetes.io/projected/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-kube-api-access-qz7p8\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.465061 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09860a85-c2e1-464a-9c6e-8361a90aa306-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " pod="openstack/nova-api-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.465124 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09860a85-c2e1-464a-9c6e-8361a90aa306-config-data\") pod \"nova-api-0\" (UID: 
\"09860a85-c2e1-464a-9c6e-8361a90aa306\") " pod="openstack/nova-api-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.465171 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.465225 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.465830 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09860a85-c2e1-464a-9c6e-8361a90aa306-logs\") pod \"nova-api-0\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " pod="openstack/nova-api-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.505285 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09860a85-c2e1-464a-9c6e-8361a90aa306-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " pod="openstack/nova-api-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.509683 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09860a85-c2e1-464a-9c6e-8361a90aa306-config-data\") pod \"nova-api-0\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " pod="openstack/nova-api-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.514412 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.570159 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.570275 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.570369 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz7p8\" (UniqueName: \"kubernetes.io/projected/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-kube-api-access-qz7p8\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.574945 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.583529 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7dw7\" (UniqueName: \"kubernetes.io/projected/09860a85-c2e1-464a-9c6e-8361a90aa306-kube-api-access-b7dw7\") pod \"nova-api-0\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " pod="openstack/nova-api-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.591119 4580 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.591887 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.680544 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz7p8\" (UniqueName: \"kubernetes.io/projected/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-kube-api-access-qz7p8\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.770928 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.790855 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.792959 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.821044 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.877179 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") " pod="openstack/nova-metadata-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.877263 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-config-data\") pod \"nova-metadata-0\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") " pod="openstack/nova-metadata-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.877304 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-logs\") pod \"nova-metadata-0\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") " pod="openstack/nova-metadata-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.877447 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5xvw\" (UniqueName: \"kubernetes.io/projected/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-kube-api-access-n5xvw\") pod \"nova-metadata-0\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") " pod="openstack/nova-metadata-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.936863 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.980408 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n5xvw\" (UniqueName: \"kubernetes.io/projected/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-kube-api-access-n5xvw\") pod \"nova-metadata-0\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") " pod="openstack/nova-metadata-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.980482 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") " pod="openstack/nova-metadata-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.980519 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-config-data\") pod \"nova-metadata-0\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") " pod="openstack/nova-metadata-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.980556 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-logs\") pod \"nova-metadata-0\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") " pod="openstack/nova-metadata-0" Mar 21 05:15:46 crc kubenswrapper[4580]: I0321 05:15:46.981020 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-logs\") pod \"nova-metadata-0\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") " pod="openstack/nova-metadata-0" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.004800 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") " pod="openstack/nova-metadata-0" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.018569 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-config-data\") pod \"nova-metadata-0\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") " pod="openstack/nova-metadata-0" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.050888 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5xvw\" (UniqueName: \"kubernetes.io/projected/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-kube-api-access-n5xvw\") pod \"nova-metadata-0\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") " pod="openstack/nova-metadata-0" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.174573 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-z5g44"] Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.182753 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.211940 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.273343 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-z5g44"] Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.295195 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.295286 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-dns-svc\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.295328 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.295368 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhb4t\" (UniqueName: \"kubernetes.io/projected/e5fbadc4-b849-4ff9-b723-acc959e19b70-kube-api-access-zhb4t\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.295413 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.295429 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-config\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.398989 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.399051 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhb4t\" (UniqueName: \"kubernetes.io/projected/e5fbadc4-b849-4ff9-b723-acc959e19b70-kube-api-access-zhb4t\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.399117 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.399137 4580 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-config\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.399190 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.399246 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-dns-svc\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.400430 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-dns-svc\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.400472 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.401177 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-config\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.401397 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.401803 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.478007 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhb4t\" (UniqueName: \"kubernetes.io/projected/e5fbadc4-b849-4ff9-b723-acc959e19b70-kube-api-access-zhb4t\") pod \"dnsmasq-dns-757b4f8459-z5g44\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.562270 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.592823 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jw68q"] Mar 21 05:15:47 crc kubenswrapper[4580]: I0321 05:15:47.928459 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:15:48 crc kubenswrapper[4580]: I0321 05:15:48.148168 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:15:48 crc kubenswrapper[4580]: W0321 05:15:48.185661 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c7c9415_46a1_40aa_96bc_4d3910f0c5c5.slice/crio-41795fa327da864496db63d0c17450c756f67b6cdeb4ed1bb38ad0a94c8812d4 WatchSource:0}: Error finding container 41795fa327da864496db63d0c17450c756f67b6cdeb4ed1bb38ad0a94c8812d4: Status 404 returned error can't find the container with id 41795fa327da864496db63d0c17450c756f67b6cdeb4ed1bb38ad0a94c8812d4 Mar 21 05:15:48 crc kubenswrapper[4580]: I0321 05:15:48.189252 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 05:15:48 crc kubenswrapper[4580]: I0321 05:15:48.390629 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:15:48 crc kubenswrapper[4580]: I0321 05:15:48.703327 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-z5g44"] Mar 21 05:15:48 crc kubenswrapper[4580]: W0321 05:15:48.725092 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5fbadc4_b849_4ff9_b723_acc959e19b70.slice/crio-9e844b45389fd9d9d67a850b8efec6b839cfa275bd1327ae11655ecaa94a000f WatchSource:0}: Error finding container 9e844b45389fd9d9d67a850b8efec6b839cfa275bd1327ae11655ecaa94a000f: Status 404 returned error can't 
find the container with id 9e844b45389fd9d9d67a850b8efec6b839cfa275bd1327ae11655ecaa94a000f Mar 21 05:15:48 crc kubenswrapper[4580]: I0321 05:15:48.757031 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-z5g44" event={"ID":"e5fbadc4-b849-4ff9-b723-acc959e19b70","Type":"ContainerStarted","Data":"9e844b45389fd9d9d67a850b8efec6b839cfa275bd1327ae11655ecaa94a000f"} Mar 21 05:15:48 crc kubenswrapper[4580]: I0321 05:15:48.773746 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jw68q" event={"ID":"c1da4990-e129-41f6-acca-138ab10c03cc","Type":"ContainerStarted","Data":"fe16d7cd3730205ea2ae2ee287da9eedbeeb418df9addf927524aa933f7be59b"} Mar 21 05:15:48 crc kubenswrapper[4580]: I0321 05:15:48.773822 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jw68q" event={"ID":"c1da4990-e129-41f6-acca-138ab10c03cc","Type":"ContainerStarted","Data":"664f31b7a1eb45d36abf3b766aefa3cc9aa5bc990a83996f63912922b5a3fe56"} Mar 21 05:15:48 crc kubenswrapper[4580]: I0321 05:15:48.812146 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-jw68q" podStartSLOduration=3.812125668 podStartE2EDuration="3.812125668s" podCreationTimestamp="2026-03-21 05:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:15:48.806289652 +0000 UTC m=+1453.888873280" watchObservedRunningTime="2026-03-21 05:15:48.812125668 +0000 UTC m=+1453.894709306" Mar 21 05:15:48 crc kubenswrapper[4580]: I0321 05:15:48.848134 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5","Type":"ContainerStarted","Data":"41795fa327da864496db63d0c17450c756f67b6cdeb4ed1bb38ad0a94c8812d4"} Mar 21 05:15:48 crc kubenswrapper[4580]: I0321 05:15:48.875017 4580 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"09860a85-c2e1-464a-9c6e-8361a90aa306","Type":"ContainerStarted","Data":"125be47a6270d51294791dbb5370a2266866b564c4a8543558320e0d18d2ee02"} Mar 21 05:15:48 crc kubenswrapper[4580]: I0321 05:15:48.884030 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2ac92af5-c44d-488d-a784-f028b868fc24","Type":"ContainerStarted","Data":"ea8ee33184cddaafb822f4c7404bea3dae7678033b54da84a868b7bcf1cd4393"} Mar 21 05:15:48 crc kubenswrapper[4580]: I0321 05:15:48.917027 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a","Type":"ContainerStarted","Data":"5134a6a29cb441a54078b4a7f06a500fc2f74d39839c5ae42f65fbb58543a3d4"} Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.024626 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6xjtd"] Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.026341 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6xjtd" Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.032771 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.032854 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.050164 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6xjtd"] Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.093006 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-config-data\") pod \"nova-cell1-conductor-db-sync-6xjtd\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " pod="openstack/nova-cell1-conductor-db-sync-6xjtd" Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.108193 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-scripts\") pod \"nova-cell1-conductor-db-sync-6xjtd\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " pod="openstack/nova-cell1-conductor-db-sync-6xjtd" Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.108280 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts7jz\" (UniqueName: \"kubernetes.io/projected/394de333-b465-45df-8251-bb4ae573b135-kube-api-access-ts7jz\") pod \"nova-cell1-conductor-db-sync-6xjtd\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " pod="openstack/nova-cell1-conductor-db-sync-6xjtd" Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.108449 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6xjtd\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " pod="openstack/nova-cell1-conductor-db-sync-6xjtd"
Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.210378 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-scripts\") pod \"nova-cell1-conductor-db-sync-6xjtd\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " pod="openstack/nova-cell1-conductor-db-sync-6xjtd"
Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.210730 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts7jz\" (UniqueName: \"kubernetes.io/projected/394de333-b465-45df-8251-bb4ae573b135-kube-api-access-ts7jz\") pod \"nova-cell1-conductor-db-sync-6xjtd\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " pod="openstack/nova-cell1-conductor-db-sync-6xjtd"
Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.210852 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6xjtd\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " pod="openstack/nova-cell1-conductor-db-sync-6xjtd"
Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.210930 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-config-data\") pod \"nova-cell1-conductor-db-sync-6xjtd\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " pod="openstack/nova-cell1-conductor-db-sync-6xjtd"
Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.225677 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6xjtd\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " pod="openstack/nova-cell1-conductor-db-sync-6xjtd"
Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.225683 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-config-data\") pod \"nova-cell1-conductor-db-sync-6xjtd\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " pod="openstack/nova-cell1-conductor-db-sync-6xjtd"
Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.228467 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-scripts\") pod \"nova-cell1-conductor-db-sync-6xjtd\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " pod="openstack/nova-cell1-conductor-db-sync-6xjtd"
Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.255275 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts7jz\" (UniqueName: \"kubernetes.io/projected/394de333-b465-45df-8251-bb4ae573b135-kube-api-access-ts7jz\") pod \"nova-cell1-conductor-db-sync-6xjtd\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " pod="openstack/nova-cell1-conductor-db-sync-6xjtd"
Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.457365 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6xjtd"
Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.961341 4580 generic.go:334] "Generic (PLEG): container finished" podID="e5fbadc4-b849-4ff9-b723-acc959e19b70" containerID="205210abd5ae6b11199c300f7f528980e6136defb5c9afe2874a96e3b1a788a6" exitCode=0
Mar 21 05:15:49 crc kubenswrapper[4580]: I0321 05:15:49.969399 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-z5g44" event={"ID":"e5fbadc4-b849-4ff9-b723-acc959e19b70","Type":"ContainerDied","Data":"205210abd5ae6b11199c300f7f528980e6136defb5c9afe2874a96e3b1a788a6"}
Mar 21 05:15:50 crc kubenswrapper[4580]: I0321 05:15:50.281269 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6xjtd"]
Mar 21 05:15:50 crc kubenswrapper[4580]: W0321 05:15:50.288019 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod394de333_b465_45df_8251_bb4ae573b135.slice/crio-573152bced2baf0e80a57c250c00727b0a4d795288824ba98b4330247554e32a WatchSource:0}: Error finding container 573152bced2baf0e80a57c250c00727b0a4d795288824ba98b4330247554e32a: Status 404 returned error can't find the container with id 573152bced2baf0e80a57c250c00727b0a4d795288824ba98b4330247554e32a
Mar 21 05:15:51 crc kubenswrapper[4580]: I0321 05:15:51.023438 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6xjtd" event={"ID":"394de333-b465-45df-8251-bb4ae573b135","Type":"ContainerStarted","Data":"9d272dbb0e955717fe563a21e66e5f4aa48f6d4a9f8f73530e65ba4d4bc33129"}
Mar 21 05:15:51 crc kubenswrapper[4580]: I0321 05:15:51.023755 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6xjtd" event={"ID":"394de333-b465-45df-8251-bb4ae573b135","Type":"ContainerStarted","Data":"573152bced2baf0e80a57c250c00727b0a4d795288824ba98b4330247554e32a"}
Mar 21 05:15:51 crc kubenswrapper[4580]: I0321 05:15:51.048115 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-z5g44" event={"ID":"e5fbadc4-b849-4ff9-b723-acc959e19b70","Type":"ContainerStarted","Data":"53a50eb9b4500ab1c51da206c75fa2e7ac479a687c959f3fd455e65baaa2a9a8"}
Mar 21 05:15:51 crc kubenswrapper[4580]: I0321 05:15:51.049160 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-z5g44"
Mar 21 05:15:51 crc kubenswrapper[4580]: I0321 05:15:51.056620 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-6xjtd" podStartSLOduration=3.05659953 podStartE2EDuration="3.05659953s" podCreationTimestamp="2026-03-21 05:15:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:15:51.049187862 +0000 UTC m=+1456.131771510" watchObservedRunningTime="2026-03-21 05:15:51.05659953 +0000 UTC m=+1456.139183158"
Mar 21 05:15:51 crc kubenswrapper[4580]: I0321 05:15:51.393803 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Mar 21 05:15:51 crc kubenswrapper[4580]: I0321 05:15:51.394185 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-587cfc8688-265kc"
Mar 21 05:15:51 crc kubenswrapper[4580]: I0321 05:15:51.396075 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"aabef473a7fedba8211603363e3d7574d3a24bbc5ab5b0fe74504ddddca72333"} pod="openstack/horizon-587cfc8688-265kc" containerMessage="Container horizon failed startup probe, will be restarted"
Mar 21 05:15:51 crc kubenswrapper[4580]: I0321 05:15:51.396420 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" containerID="cri-o://aabef473a7fedba8211603363e3d7574d3a24bbc5ab5b0fe74504ddddca72333" gracePeriod=30
Mar 21 05:15:51 crc kubenswrapper[4580]: I0321 05:15:51.508716 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused"
Mar 21 05:15:51 crc kubenswrapper[4580]: I0321 05:15:51.555262 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-z5g44" podStartSLOduration=4.555228421 podStartE2EDuration="4.555228421s" podCreationTimestamp="2026-03-21 05:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:15:51.087273299 +0000 UTC m=+1456.169856947" watchObservedRunningTime="2026-03-21 05:15:51.555228421 +0000 UTC m=+1456.637812049"
Mar 21 05:15:51 crc kubenswrapper[4580]: I0321 05:15:51.578033 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 05:15:51 crc kubenswrapper[4580]: I0321 05:15:51.613639 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 05:15:53 crc kubenswrapper[4580]: I0321 05:15:53.910311 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 21 05:15:57 crc kubenswrapper[4580]: I0321 05:15:57.123684 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2ac92af5-c44d-488d-a784-f028b868fc24","Type":"ContainerStarted","Data":"99c20bb2ab3dd9840212595c1bde9dc38fcb723ffbb4957bd9fd6b012f1769d2"}
Mar 21 05:15:57 crc kubenswrapper[4580]: I0321 05:15:57.127552 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a","Type":"ContainerStarted","Data":"151a2d4133d4fda411cb7e2461df9558ff73ef73cfc87d2612c67236b6aae24b"}
Mar 21 05:15:57 crc kubenswrapper[4580]: I0321 05:15:57.129306 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5","Type":"ContainerStarted","Data":"6cce9a7d51a1669c0bc276a7b641ea973675dc3f163e477f746f96a06fb08257"}
Mar 21 05:15:57 crc kubenswrapper[4580]: I0321 05:15:57.129599 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="5c7c9415-46a1-40aa-96bc-4d3910f0c5c5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://6cce9a7d51a1669c0bc276a7b641ea973675dc3f163e477f746f96a06fb08257" gracePeriod=30
Mar 21 05:15:57 crc kubenswrapper[4580]: I0321 05:15:57.131557 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"09860a85-c2e1-464a-9c6e-8361a90aa306","Type":"ContainerStarted","Data":"1065a064f685760b22908d8ea4a0ac2d7ca49180bc66c81305f1069451eba26f"}
Mar 21 05:15:57 crc kubenswrapper[4580]: I0321 05:15:57.172302 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.045720638 podStartE2EDuration="12.172277533s" podCreationTimestamp="2026-03-21 05:15:45 +0000 UTC" firstStartedPulling="2026-03-21 05:15:47.922778638 +0000 UTC m=+1453.005362266" lastFinishedPulling="2026-03-21 05:15:56.049335543 +0000 UTC m=+1461.131919161" observedRunningTime="2026-03-21 05:15:57.169405436 +0000 UTC m=+1462.251989074" watchObservedRunningTime="2026-03-21 05:15:57.172277533 +0000 UTC m=+1462.254861161"
Mar 21 05:15:57 crc kubenswrapper[4580]: I0321 05:15:57.200179 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.052671414 podStartE2EDuration="11.200158388s" podCreationTimestamp="2026-03-21 05:15:46 +0000 UTC" firstStartedPulling="2026-03-21 05:15:48.208847091 +0000 UTC m=+1453.291430719" lastFinishedPulling="2026-03-21 05:15:56.356334065 +0000 UTC m=+1461.438917693" observedRunningTime="2026-03-21 05:15:57.197815805 +0000 UTC m=+1462.280399463" watchObservedRunningTime="2026-03-21 05:15:57.200158388 +0000 UTC m=+1462.282742016"
Mar 21 05:15:57 crc kubenswrapper[4580]: I0321 05:15:57.565021 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-z5g44"
Mar 21 05:15:57 crc kubenswrapper[4580]: I0321 05:15:57.681470 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-28d7v"]
Mar 21 05:15:57 crc kubenswrapper[4580]: I0321 05:15:57.681739 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" podUID="03e4309f-b795-4c00-8058-616430f6ea8a" containerName="dnsmasq-dns" containerID="cri-o://8f421d9f786f076c9ee36dd8e72f6279949d0c1c7497da67bb7754fe4da7be07" gracePeriod=10
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.161859 4580 generic.go:334] "Generic (PLEG): container finished" podID="03e4309f-b795-4c00-8058-616430f6ea8a" containerID="8f421d9f786f076c9ee36dd8e72f6279949d0c1c7497da67bb7754fe4da7be07" exitCode=0
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.161978 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" event={"ID":"03e4309f-b795-4c00-8058-616430f6ea8a","Type":"ContainerDied","Data":"8f421d9f786f076c9ee36dd8e72f6279949d0c1c7497da67bb7754fe4da7be07"}
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.173567 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"09860a85-c2e1-464a-9c6e-8361a90aa306","Type":"ContainerStarted","Data":"c6c7332aac4bc8652c8c19a5f6a8c6aa6f416ad5f15fa7dfc2cf51f177204cbe"}
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.184563 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" containerName="nova-metadata-log" containerID="cri-o://151a2d4133d4fda411cb7e2461df9558ff73ef73cfc87d2612c67236b6aae24b" gracePeriod=30
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.184908 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" containerName="nova-metadata-metadata" containerID="cri-o://e91bc741dfda5f6fd527ee94e9c01e60a3aec34ca4a7466b7e49b84456bfa05d" gracePeriod=30
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.184940 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a","Type":"ContainerStarted","Data":"e91bc741dfda5f6fd527ee94e9c01e60a3aec34ca4a7466b7e49b84456bfa05d"}
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.244297 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.273380185 podStartE2EDuration="12.244279191s" podCreationTimestamp="2026-03-21 05:15:46 +0000 UTC" firstStartedPulling="2026-03-21 05:15:48.408505535 +0000 UTC m=+1453.491089153" lastFinishedPulling="2026-03-21 05:15:56.379404531 +0000 UTC m=+1461.461988159" observedRunningTime="2026-03-21 05:15:58.240255354 +0000 UTC m=+1463.322838982" watchObservedRunningTime="2026-03-21 05:15:58.244279191 +0000 UTC m=+1463.326862819"
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.245185 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.041726106 podStartE2EDuration="12.245180905s" podCreationTimestamp="2026-03-21 05:15:46 +0000 UTC" firstStartedPulling="2026-03-21 05:15:48.153947854 +0000 UTC m=+1453.236531482" lastFinishedPulling="2026-03-21 05:15:56.357402643 +0000 UTC m=+1461.439986281" observedRunningTime="2026-03-21 05:15:58.210793347 +0000 UTC m=+1463.293376985" watchObservedRunningTime="2026-03-21 05:15:58.245180905 +0000 UTC m=+1463.327764523"
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.764578 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v"
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.855692 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cqqt\" (UniqueName: \"kubernetes.io/projected/03e4309f-b795-4c00-8058-616430f6ea8a-kube-api-access-9cqqt\") pod \"03e4309f-b795-4c00-8058-616430f6ea8a\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") "
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.855735 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-dns-swift-storage-0\") pod \"03e4309f-b795-4c00-8058-616430f6ea8a\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") "
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.855753 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-ovsdbserver-sb\") pod \"03e4309f-b795-4c00-8058-616430f6ea8a\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") "
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.855772 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-dns-svc\") pod \"03e4309f-b795-4c00-8058-616430f6ea8a\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") "
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.855866 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-config\") pod \"03e4309f-b795-4c00-8058-616430f6ea8a\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") "
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.856134 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-ovsdbserver-nb\") pod \"03e4309f-b795-4c00-8058-616430f6ea8a\" (UID: \"03e4309f-b795-4c00-8058-616430f6ea8a\") "
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.871022 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03e4309f-b795-4c00-8058-616430f6ea8a-kube-api-access-9cqqt" (OuterVolumeSpecName: "kube-api-access-9cqqt") pod "03e4309f-b795-4c00-8058-616430f6ea8a" (UID: "03e4309f-b795-4c00-8058-616430f6ea8a"). InnerVolumeSpecName "kube-api-access-9cqqt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.940511 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "03e4309f-b795-4c00-8058-616430f6ea8a" (UID: "03e4309f-b795-4c00-8058-616430f6ea8a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.959264 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cqqt\" (UniqueName: \"kubernetes.io/projected/03e4309f-b795-4c00-8058-616430f6ea8a-kube-api-access-9cqqt\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.959294 4580 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.960443 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-config" (OuterVolumeSpecName: "config") pod "03e4309f-b795-4c00-8058-616430f6ea8a" (UID: "03e4309f-b795-4c00-8058-616430f6ea8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.974600 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "03e4309f-b795-4c00-8058-616430f6ea8a" (UID: "03e4309f-b795-4c00-8058-616430f6ea8a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.975005 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "03e4309f-b795-4c00-8058-616430f6ea8a" (UID: "03e4309f-b795-4c00-8058-616430f6ea8a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:15:58 crc kubenswrapper[4580]: I0321 05:15:58.984915 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "03e4309f-b795-4c00-8058-616430f6ea8a" (UID: "03e4309f-b795-4c00-8058-616430f6ea8a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.061720 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.061757 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.061767 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.061776 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03e4309f-b795-4c00-8058-616430f6ea8a-config\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.217819 4580 generic.go:334] "Generic (PLEG): container finished" podID="09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" containerID="e91bc741dfda5f6fd527ee94e9c01e60a3aec34ca4a7466b7e49b84456bfa05d" exitCode=0
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.217862 4580 generic.go:334] "Generic (PLEG): container finished" podID="09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" containerID="151a2d4133d4fda411cb7e2461df9558ff73ef73cfc87d2612c67236b6aae24b" exitCode=143
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.217969 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a","Type":"ContainerDied","Data":"e91bc741dfda5f6fd527ee94e9c01e60a3aec34ca4a7466b7e49b84456bfa05d"}
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.218011 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a","Type":"ContainerDied","Data":"151a2d4133d4fda411cb7e2461df9558ff73ef73cfc87d2612c67236b6aae24b"}
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.218032 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a","Type":"ContainerDied","Data":"5134a6a29cb441a54078b4a7f06a500fc2f74d39839c5ae42f65fbb58543a3d4"}
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.218047 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5134a6a29cb441a54078b4a7f06a500fc2f74d39839c5ae42f65fbb58543a3d4"
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.230407 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v"
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.230411 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-28d7v" event={"ID":"03e4309f-b795-4c00-8058-616430f6ea8a","Type":"ContainerDied","Data":"9602c4f1b928dce3f6d73f264d3ce38b7d5063b4528308525f46a816767918c1"}
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.231744 4580 scope.go:117] "RemoveContainer" containerID="8f421d9f786f076c9ee36dd8e72f6279949d0c1c7497da67bb7754fe4da7be07"
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.292553 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.305754 4580 scope.go:117] "RemoveContainer" containerID="7dcb1b7a7f920f07fde018a93c10c8821c95a73ee12fb049f533da5edadc5771"
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.332195 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-28d7v"]
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.351319 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-28d7v"]
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.386324 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-logs\") pod \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") "
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.386465 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-combined-ca-bundle\") pod \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") "
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.386501 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5xvw\" (UniqueName: \"kubernetes.io/projected/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-kube-api-access-n5xvw\") pod \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") "
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.386532 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-config-data\") pod \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\" (UID: \"09aefeeb-fcb1-46a5-91e1-dd49b7ca976a\") "
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.387915 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-logs" (OuterVolumeSpecName: "logs") pod "09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" (UID: "09aefeeb-fcb1-46a5-91e1-dd49b7ca976a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.411683 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-kube-api-access-n5xvw" (OuterVolumeSpecName: "kube-api-access-n5xvw") pod "09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" (UID: "09aefeeb-fcb1-46a5-91e1-dd49b7ca976a"). InnerVolumeSpecName "kube-api-access-n5xvw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.434208 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" (UID: "09aefeeb-fcb1-46a5-91e1-dd49b7ca976a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.489256 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.489302 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5xvw\" (UniqueName: \"kubernetes.io/projected/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-kube-api-access-n5xvw\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.489315 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-logs\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.519312 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-config-data" (OuterVolumeSpecName: "config-data") pod "09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" (UID: "09aefeeb-fcb1-46a5-91e1-dd49b7ca976a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.591453 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:15:59 crc kubenswrapper[4580]: I0321 05:15:59.657267 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03e4309f-b795-4c00-8058-616430f6ea8a" path="/var/lib/kubelet/pods/03e4309f-b795-4c00-8058-616430f6ea8a/volumes"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.154854 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567836-4prkc"]
Mar 21 05:16:00 crc kubenswrapper[4580]: E0321 05:16:00.155323 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03e4309f-b795-4c00-8058-616430f6ea8a" containerName="init"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.155350 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="03e4309f-b795-4c00-8058-616430f6ea8a" containerName="init"
Mar 21 05:16:00 crc kubenswrapper[4580]: E0321 05:16:00.155386 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03e4309f-b795-4c00-8058-616430f6ea8a" containerName="dnsmasq-dns"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.155396 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="03e4309f-b795-4c00-8058-616430f6ea8a" containerName="dnsmasq-dns"
Mar 21 05:16:00 crc kubenswrapper[4580]: E0321 05:16:00.155420 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" containerName="nova-metadata-log"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.155432 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" containerName="nova-metadata-log"
Mar 21 05:16:00 crc kubenswrapper[4580]: E0321 05:16:00.155450 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" containerName="nova-metadata-metadata"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.155458 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" containerName="nova-metadata-metadata"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.155688 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" containerName="nova-metadata-log"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.155713 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="03e4309f-b795-4c00-8058-616430f6ea8a" containerName="dnsmasq-dns"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.155728 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" containerName="nova-metadata-metadata"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.156721 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-4prkc"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.169269 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.169370 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.169707 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.171335 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-4prkc"]
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.211401 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvhfs\" (UniqueName: \"kubernetes.io/projected/fe65e038-1d7f-470f-88ea-ef5352681356-kube-api-access-dvhfs\") pod \"auto-csr-approver-29567836-4prkc\" (UID: \"fe65e038-1d7f-470f-88ea-ef5352681356\") " pod="openshift-infra/auto-csr-approver-29567836-4prkc"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.251750 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.313080 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvhfs\" (UniqueName: \"kubernetes.io/projected/fe65e038-1d7f-470f-88ea-ef5352681356-kube-api-access-dvhfs\") pod \"auto-csr-approver-29567836-4prkc\" (UID: \"fe65e038-1d7f-470f-88ea-ef5352681356\") " pod="openshift-infra/auto-csr-approver-29567836-4prkc"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.332496 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.347253 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvhfs\" (UniqueName: \"kubernetes.io/projected/fe65e038-1d7f-470f-88ea-ef5352681356-kube-api-access-dvhfs\") pod \"auto-csr-approver-29567836-4prkc\" (UID: \"fe65e038-1d7f-470f-88ea-ef5352681356\") " pod="openshift-infra/auto-csr-approver-29567836-4prkc"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.356565 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.386598 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.397511 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.401341 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.402114 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.404042 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.517493 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-config-data\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.517919 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.518025 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c3ff363-eba3-4c3e-ae67-6064971743bb-logs\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.518154 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.518188 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggfcp\" (UniqueName: \"kubernetes.io/projected/4c3ff363-eba3-4c3e-ae67-6064971743bb-kube-api-access-ggfcp\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.546632 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-4prkc"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.619760 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.620089 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggfcp\" (UniqueName: \"kubernetes.io/projected/4c3ff363-eba3-4c3e-ae67-6064971743bb-kube-api-access-ggfcp\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.620137 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-config-data\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.620209 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.620276 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c3ff363-eba3-4c3e-ae67-6064971743bb-logs\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.626188 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c3ff363-eba3-4c3e-ae67-6064971743bb-logs\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.639182 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.639226 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggfcp\" (UniqueName: \"kubernetes.io/projected/4c3ff363-eba3-4c3e-ae67-6064971743bb-kube-api-access-ggfcp\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.660846 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-config-data\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0"
Mar 21 05:16:00 crc kubenswrapper[4580]:
I0321 05:16:00.669793 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " pod="openstack/nova-metadata-0" Mar 21 05:16:00 crc kubenswrapper[4580]: I0321 05:16:00.735011 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:16:01 crc kubenswrapper[4580]: I0321 05:16:01.436459 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 21 05:16:01 crc kubenswrapper[4580]: I0321 05:16:01.458692 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:16:01 crc kubenswrapper[4580]: W0321 05:16:01.459423 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c3ff363_eba3_4c3e_ae67_6064971743bb.slice/crio-48c0aba20f369d5ae0b409d84c823614fc8292f1b99a77efc52a585eda22d6fc WatchSource:0}: Error finding container 48c0aba20f369d5ae0b409d84c823614fc8292f1b99a77efc52a585eda22d6fc: Status 404 returned error can't find the container with id 48c0aba20f369d5ae0b409d84c823614fc8292f1b99a77efc52a585eda22d6fc Mar 21 05:16:01 crc kubenswrapper[4580]: I0321 05:16:01.524745 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-4prkc"] Mar 21 05:16:01 crc kubenswrapper[4580]: I0321 05:16:01.633326 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09aefeeb-fcb1-46a5-91e1-dd49b7ca976a" path="/var/lib/kubelet/pods/09aefeeb-fcb1-46a5-91e1-dd49b7ca976a/volumes" Mar 21 05:16:01 crc kubenswrapper[4580]: I0321 05:16:01.772189 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:02 crc kubenswrapper[4580]: I0321 
05:16:02.291842 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567836-4prkc" event={"ID":"fe65e038-1d7f-470f-88ea-ef5352681356","Type":"ContainerStarted","Data":"1154cfd7a2e7b838d9f145a24740a586e0715d7dcca3197ea6d78e5ca1475d5d"} Mar 21 05:16:02 crc kubenswrapper[4580]: I0321 05:16:02.298398 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c3ff363-eba3-4c3e-ae67-6064971743bb","Type":"ContainerStarted","Data":"7c50325509ba7c5a7261d7c3d98923bb9e1740ac4ca9414f94835b49fbf95c01"} Mar 21 05:16:02 crc kubenswrapper[4580]: I0321 05:16:02.298467 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c3ff363-eba3-4c3e-ae67-6064971743bb","Type":"ContainerStarted","Data":"48c0aba20f369d5ae0b409d84c823614fc8292f1b99a77efc52a585eda22d6fc"} Mar 21 05:16:03 crc kubenswrapper[4580]: I0321 05:16:03.310815 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c3ff363-eba3-4c3e-ae67-6064971743bb","Type":"ContainerStarted","Data":"ab91fadd278aba3406c5a2b8961a6c385e458c78adb9bc2bc25ddeb9cc772ab6"} Mar 21 05:16:03 crc kubenswrapper[4580]: I0321 05:16:03.313236 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567836-4prkc" event={"ID":"fe65e038-1d7f-470f-88ea-ef5352681356","Type":"ContainerStarted","Data":"f90afca2acfaaae581f27bc22e5b2a0a0d1e1168cc4e0427b1e6653bed4a6160"} Mar 21 05:16:03 crc kubenswrapper[4580]: I0321 05:16:03.346540 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.34651682 podStartE2EDuration="3.34651682s" podCreationTimestamp="2026-03-21 05:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:16:03.330710888 +0000 UTC m=+1468.413294526" 
watchObservedRunningTime="2026-03-21 05:16:03.34651682 +0000 UTC m=+1468.429100468" Mar 21 05:16:04 crc kubenswrapper[4580]: I0321 05:16:04.338666 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567836-4prkc" podStartSLOduration=3.260272347 podStartE2EDuration="4.338646396s" podCreationTimestamp="2026-03-21 05:16:00 +0000 UTC" firstStartedPulling="2026-03-21 05:16:01.531687506 +0000 UTC m=+1466.614271134" lastFinishedPulling="2026-03-21 05:16:02.610061555 +0000 UTC m=+1467.692645183" observedRunningTime="2026-03-21 05:16:04.336411456 +0000 UTC m=+1469.418995074" watchObservedRunningTime="2026-03-21 05:16:04.338646396 +0000 UTC m=+1469.421230024" Mar 21 05:16:04 crc kubenswrapper[4580]: I0321 05:16:04.593344 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 05:16:04 crc kubenswrapper[4580]: I0321 05:16:04.593409 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 05:16:06 crc kubenswrapper[4580]: I0321 05:16:06.342401 4580 generic.go:334] "Generic (PLEG): container finished" podID="c1da4990-e129-41f6-acca-138ab10c03cc" containerID="fe16d7cd3730205ea2ae2ee287da9eedbeeb418df9addf927524aa933f7be59b" exitCode=0 Mar 21 05:16:06 crc kubenswrapper[4580]: I0321 05:16:06.342498 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jw68q" event={"ID":"c1da4990-e129-41f6-acca-138ab10c03cc","Type":"ContainerDied","Data":"fe16d7cd3730205ea2ae2ee287da9eedbeeb418df9addf927524aa933f7be59b"} Mar 21 05:16:06 crc kubenswrapper[4580]: I0321 05:16:06.437023 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 21 05:16:06 crc kubenswrapper[4580]: I0321 05:16:06.464223 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 21 05:16:06 crc kubenswrapper[4580]: I0321 
05:16:06.513990 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:16:06 crc kubenswrapper[4580]: I0321 05:16:06.514091 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:16:06 crc kubenswrapper[4580]: I0321 05:16:06.515606 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"f660370a5b85c0757c978411d4b13c5ed188b23f7b881d8e81f31c5eac41a537"} pod="openstack/horizon-67655f8b6-mbx6n" containerMessage="Container horizon failed startup probe, will be restarted" Mar 21 05:16:06 crc kubenswrapper[4580]: I0321 05:16:06.515653 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" containerID="cri-o://f660370a5b85c0757c978411d4b13c5ed188b23f7b881d8e81f31c5eac41a537" gracePeriod=30 Mar 21 05:16:06 crc kubenswrapper[4580]: I0321 05:16:06.592909 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 05:16:06 crc kubenswrapper[4580]: I0321 05:16:06.592958 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.388484 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.678027 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="09860a85-c2e1-464a-9c6e-8361a90aa306" containerName="nova-api-api" 
probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.678630 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="09860a85-c2e1-464a-9c6e-8361a90aa306" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.792976 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jw68q" Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.878824 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-config-data\") pod \"c1da4990-e129-41f6-acca-138ab10c03cc\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.878875 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-scripts\") pod \"c1da4990-e129-41f6-acca-138ab10c03cc\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.878916 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmvwh\" (UniqueName: \"kubernetes.io/projected/c1da4990-e129-41f6-acca-138ab10c03cc-kube-api-access-cmvwh\") pod \"c1da4990-e129-41f6-acca-138ab10c03cc\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.879082 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-combined-ca-bundle\") pod \"c1da4990-e129-41f6-acca-138ab10c03cc\" (UID: \"c1da4990-e129-41f6-acca-138ab10c03cc\") " Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.887741 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1da4990-e129-41f6-acca-138ab10c03cc-kube-api-access-cmvwh" (OuterVolumeSpecName: "kube-api-access-cmvwh") pod "c1da4990-e129-41f6-acca-138ab10c03cc" (UID: "c1da4990-e129-41f6-acca-138ab10c03cc"). InnerVolumeSpecName "kube-api-access-cmvwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.890578 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-scripts" (OuterVolumeSpecName: "scripts") pod "c1da4990-e129-41f6-acca-138ab10c03cc" (UID: "c1da4990-e129-41f6-acca-138ab10c03cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.914007 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1da4990-e129-41f6-acca-138ab10c03cc" (UID: "c1da4990-e129-41f6-acca-138ab10c03cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.918141 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-config-data" (OuterVolumeSpecName: "config-data") pod "c1da4990-e129-41f6-acca-138ab10c03cc" (UID: "c1da4990-e129-41f6-acca-138ab10c03cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.981747 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.982085 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.982177 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmvwh\" (UniqueName: \"kubernetes.io/projected/c1da4990-e129-41f6-acca-138ab10c03cc-kube-api-access-cmvwh\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:07 crc kubenswrapper[4580]: I0321 05:16:07.982251 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1da4990-e129-41f6-acca-138ab10c03cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:08 crc kubenswrapper[4580]: I0321 05:16:08.360290 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jw68q" event={"ID":"c1da4990-e129-41f6-acca-138ab10c03cc","Type":"ContainerDied","Data":"664f31b7a1eb45d36abf3b766aefa3cc9aa5bc990a83996f63912922b5a3fe56"} Mar 21 05:16:08 crc kubenswrapper[4580]: I0321 05:16:08.360582 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="664f31b7a1eb45d36abf3b766aefa3cc9aa5bc990a83996f63912922b5a3fe56" Mar 21 05:16:08 crc kubenswrapper[4580]: I0321 05:16:08.360300 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jw68q" Mar 21 05:16:08 crc kubenswrapper[4580]: I0321 05:16:08.363586 4580 generic.go:334] "Generic (PLEG): container finished" podID="fe65e038-1d7f-470f-88ea-ef5352681356" containerID="f90afca2acfaaae581f27bc22e5b2a0a0d1e1168cc4e0427b1e6653bed4a6160" exitCode=0 Mar 21 05:16:08 crc kubenswrapper[4580]: I0321 05:16:08.364032 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567836-4prkc" event={"ID":"fe65e038-1d7f-470f-88ea-ef5352681356","Type":"ContainerDied","Data":"f90afca2acfaaae581f27bc22e5b2a0a0d1e1168cc4e0427b1e6653bed4a6160"} Mar 21 05:16:08 crc kubenswrapper[4580]: I0321 05:16:08.530270 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:16:08 crc kubenswrapper[4580]: I0321 05:16:08.542871 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:16:08 crc kubenswrapper[4580]: I0321 05:16:08.543111 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="09860a85-c2e1-464a-9c6e-8361a90aa306" containerName="nova-api-log" containerID="cri-o://1065a064f685760b22908d8ea4a0ac2d7ca49180bc66c81305f1069451eba26f" gracePeriod=30 Mar 21 05:16:08 crc kubenswrapper[4580]: I0321 05:16:08.543260 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="09860a85-c2e1-464a-9c6e-8361a90aa306" containerName="nova-api-api" containerID="cri-o://c6c7332aac4bc8652c8c19a5f6a8c6aa6f416ad5f15fa7dfc2cf51f177204cbe" gracePeriod=30 Mar 21 05:16:08 crc kubenswrapper[4580]: I0321 05:16:08.609417 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:16:08 crc kubenswrapper[4580]: I0321 05:16:08.609633 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4c3ff363-eba3-4c3e-ae67-6064971743bb" 
containerName="nova-metadata-log" containerID="cri-o://7c50325509ba7c5a7261d7c3d98923bb9e1740ac4ca9414f94835b49fbf95c01" gracePeriod=30 Mar 21 05:16:08 crc kubenswrapper[4580]: I0321 05:16:08.609747 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4c3ff363-eba3-4c3e-ae67-6064971743bb" containerName="nova-metadata-metadata" containerID="cri-o://ab91fadd278aba3406c5a2b8961a6c385e458c78adb9bc2bc25ddeb9cc772ab6" gracePeriod=30 Mar 21 05:16:09 crc kubenswrapper[4580]: I0321 05:16:09.374038 4580 generic.go:334] "Generic (PLEG): container finished" podID="09860a85-c2e1-464a-9c6e-8361a90aa306" containerID="1065a064f685760b22908d8ea4a0ac2d7ca49180bc66c81305f1069451eba26f" exitCode=143 Mar 21 05:16:09 crc kubenswrapper[4580]: I0321 05:16:09.374105 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"09860a85-c2e1-464a-9c6e-8361a90aa306","Type":"ContainerDied","Data":"1065a064f685760b22908d8ea4a0ac2d7ca49180bc66c81305f1069451eba26f"} Mar 21 05:16:09 crc kubenswrapper[4580]: I0321 05:16:09.376249 4580 generic.go:334] "Generic (PLEG): container finished" podID="4c3ff363-eba3-4c3e-ae67-6064971743bb" containerID="ab91fadd278aba3406c5a2b8961a6c385e458c78adb9bc2bc25ddeb9cc772ab6" exitCode=0 Mar 21 05:16:09 crc kubenswrapper[4580]: I0321 05:16:09.376266 4580 generic.go:334] "Generic (PLEG): container finished" podID="4c3ff363-eba3-4c3e-ae67-6064971743bb" containerID="7c50325509ba7c5a7261d7c3d98923bb9e1740ac4ca9414f94835b49fbf95c01" exitCode=143 Mar 21 05:16:09 crc kubenswrapper[4580]: I0321 05:16:09.376333 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c3ff363-eba3-4c3e-ae67-6064971743bb","Type":"ContainerDied","Data":"ab91fadd278aba3406c5a2b8961a6c385e458c78adb9bc2bc25ddeb9cc772ab6"} Mar 21 05:16:09 crc kubenswrapper[4580]: I0321 05:16:09.376404 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"4c3ff363-eba3-4c3e-ae67-6064971743bb","Type":"ContainerDied","Data":"7c50325509ba7c5a7261d7c3d98923bb9e1740ac4ca9414f94835b49fbf95c01"} Mar 21 05:16:09 crc kubenswrapper[4580]: I0321 05:16:09.376410 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2ac92af5-c44d-488d-a784-f028b868fc24" containerName="nova-scheduler-scheduler" containerID="cri-o://99c20bb2ab3dd9840212595c1bde9dc38fcb723ffbb4957bd9fd6b012f1769d2" gracePeriod=30 Mar 21 05:16:09 crc kubenswrapper[4580]: I0321 05:16:09.944080 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-4prkc" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.030842 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvhfs\" (UniqueName: \"kubernetes.io/projected/fe65e038-1d7f-470f-88ea-ef5352681356-kube-api-access-dvhfs\") pod \"fe65e038-1d7f-470f-88ea-ef5352681356\" (UID: \"fe65e038-1d7f-470f-88ea-ef5352681356\") " Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.056225 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe65e038-1d7f-470f-88ea-ef5352681356-kube-api-access-dvhfs" (OuterVolumeSpecName: "kube-api-access-dvhfs") pod "fe65e038-1d7f-470f-88ea-ef5352681356" (UID: "fe65e038-1d7f-470f-88ea-ef5352681356"). InnerVolumeSpecName "kube-api-access-dvhfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.134324 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvhfs\" (UniqueName: \"kubernetes.io/projected/fe65e038-1d7f-470f-88ea-ef5352681356-kube-api-access-dvhfs\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.151496 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.235810 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggfcp\" (UniqueName: \"kubernetes.io/projected/4c3ff363-eba3-4c3e-ae67-6064971743bb-kube-api-access-ggfcp\") pod \"4c3ff363-eba3-4c3e-ae67-6064971743bb\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.236252 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c3ff363-eba3-4c3e-ae67-6064971743bb-logs\") pod \"4c3ff363-eba3-4c3e-ae67-6064971743bb\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.236321 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-nova-metadata-tls-certs\") pod \"4c3ff363-eba3-4c3e-ae67-6064971743bb\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.236364 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-combined-ca-bundle\") pod \"4c3ff363-eba3-4c3e-ae67-6064971743bb\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.236413 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-config-data\") pod \"4c3ff363-eba3-4c3e-ae67-6064971743bb\" (UID: \"4c3ff363-eba3-4c3e-ae67-6064971743bb\") " Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.236630 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4c3ff363-eba3-4c3e-ae67-6064971743bb-logs" (OuterVolumeSpecName: "logs") pod "4c3ff363-eba3-4c3e-ae67-6064971743bb" (UID: "4c3ff363-eba3-4c3e-ae67-6064971743bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.237184 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c3ff363-eba3-4c3e-ae67-6064971743bb-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.253899 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c3ff363-eba3-4c3e-ae67-6064971743bb-kube-api-access-ggfcp" (OuterVolumeSpecName: "kube-api-access-ggfcp") pod "4c3ff363-eba3-4c3e-ae67-6064971743bb" (UID: "4c3ff363-eba3-4c3e-ae67-6064971743bb"). InnerVolumeSpecName "kube-api-access-ggfcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.263310 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-config-data" (OuterVolumeSpecName: "config-data") pod "4c3ff363-eba3-4c3e-ae67-6064971743bb" (UID: "4c3ff363-eba3-4c3e-ae67-6064971743bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.273155 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c3ff363-eba3-4c3e-ae67-6064971743bb" (UID: "4c3ff363-eba3-4c3e-ae67-6064971743bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.288059 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4c3ff363-eba3-4c3e-ae67-6064971743bb" (UID: "4c3ff363-eba3-4c3e-ae67-6064971743bb"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.339552 4580 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.339588 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.339598 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3ff363-eba3-4c3e-ae67-6064971743bb-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.339608 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggfcp\" (UniqueName: \"kubernetes.io/projected/4c3ff363-eba3-4c3e-ae67-6064971743bb-kube-api-access-ggfcp\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.393233 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c3ff363-eba3-4c3e-ae67-6064971743bb","Type":"ContainerDied","Data":"48c0aba20f369d5ae0b409d84c823614fc8292f1b99a77efc52a585eda22d6fc"} Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.393312 4580 scope.go:117] 
"RemoveContainer" containerID="ab91fadd278aba3406c5a2b8961a6c385e458c78adb9bc2bc25ddeb9cc772ab6" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.393482 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.398298 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567836-4prkc" event={"ID":"fe65e038-1d7f-470f-88ea-ef5352681356","Type":"ContainerDied","Data":"1154cfd7a2e7b838d9f145a24740a586e0715d7dcca3197ea6d78e5ca1475d5d"} Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.398339 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1154cfd7a2e7b838d9f145a24740a586e0715d7dcca3197ea6d78e5ca1475d5d" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.398358 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-4prkc" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.446027 4580 scope.go:117] "RemoveContainer" containerID="7c50325509ba7c5a7261d7c3d98923bb9e1740ac4ca9414f94835b49fbf95c01" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.456896 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.469427 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.499767 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-lls59"] Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.534788 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-lls59"] Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.543459 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 21 
05:16:10 crc kubenswrapper[4580]: E0321 05:16:10.543985 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1da4990-e129-41f6-acca-138ab10c03cc" containerName="nova-manage" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.544010 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1da4990-e129-41f6-acca-138ab10c03cc" containerName="nova-manage" Mar 21 05:16:10 crc kubenswrapper[4580]: E0321 05:16:10.544038 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3ff363-eba3-4c3e-ae67-6064971743bb" containerName="nova-metadata-metadata" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.544049 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3ff363-eba3-4c3e-ae67-6064971743bb" containerName="nova-metadata-metadata" Mar 21 05:16:10 crc kubenswrapper[4580]: E0321 05:16:10.544078 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe65e038-1d7f-470f-88ea-ef5352681356" containerName="oc" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.544087 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe65e038-1d7f-470f-88ea-ef5352681356" containerName="oc" Mar 21 05:16:10 crc kubenswrapper[4580]: E0321 05:16:10.544097 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3ff363-eba3-4c3e-ae67-6064971743bb" containerName="nova-metadata-log" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.544105 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3ff363-eba3-4c3e-ae67-6064971743bb" containerName="nova-metadata-log" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.544379 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3ff363-eba3-4c3e-ae67-6064971743bb" containerName="nova-metadata-metadata" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.544413 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1da4990-e129-41f6-acca-138ab10c03cc" containerName="nova-manage" Mar 21 05:16:10 crc 
kubenswrapper[4580]: I0321 05:16:10.544430 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe65e038-1d7f-470f-88ea-ef5352681356" containerName="oc" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.544444 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3ff363-eba3-4c3e-ae67-6064971743bb" containerName="nova-metadata-log" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.545717 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.549501 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.549703 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.553196 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.644526 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wsrh\" (UniqueName: \"kubernetes.io/projected/581fa0f6-632d-4054-b169-0aa596f21ee2-kube-api-access-6wsrh\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.644574 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-config-data\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.644807 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/581fa0f6-632d-4054-b169-0aa596f21ee2-logs\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.644928 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.645106 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.746751 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.746834 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wsrh\" (UniqueName: \"kubernetes.io/projected/581fa0f6-632d-4054-b169-0aa596f21ee2-kube-api-access-6wsrh\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.746862 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-config-data\") pod \"nova-metadata-0\" (UID: 
\"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.746924 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/581fa0f6-632d-4054-b169-0aa596f21ee2-logs\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.746963 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.751705 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.755432 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/581fa0f6-632d-4054-b169-0aa596f21ee2-logs\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.759571 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.760911 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-config-data\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.776590 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wsrh\" (UniqueName: \"kubernetes.io/projected/581fa0f6-632d-4054-b169-0aa596f21ee2-kube-api-access-6wsrh\") pod \"nova-metadata-0\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " pod="openstack/nova-metadata-0" Mar 21 05:16:10 crc kubenswrapper[4580]: I0321 05:16:10.872752 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.404880 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.424060 4580 generic.go:334] "Generic (PLEG): container finished" podID="2ac92af5-c44d-488d-a784-f028b868fc24" containerID="99c20bb2ab3dd9840212595c1bde9dc38fcb723ffbb4957bd9fd6b012f1769d2" exitCode=0 Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.424111 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2ac92af5-c44d-488d-a784-f028b868fc24","Type":"ContainerDied","Data":"99c20bb2ab3dd9840212595c1bde9dc38fcb723ffbb4957bd9fd6b012f1769d2"} Mar 21 05:16:11 crc kubenswrapper[4580]: E0321 05:16:11.436826 4580 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99c20bb2ab3dd9840212595c1bde9dc38fcb723ffbb4957bd9fd6b012f1769d2 is running failed: container process not found" containerID="99c20bb2ab3dd9840212595c1bde9dc38fcb723ffbb4957bd9fd6b012f1769d2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 05:16:11 crc kubenswrapper[4580]: E0321 05:16:11.437087 4580 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99c20bb2ab3dd9840212595c1bde9dc38fcb723ffbb4957bd9fd6b012f1769d2 is running failed: container process not found" containerID="99c20bb2ab3dd9840212595c1bde9dc38fcb723ffbb4957bd9fd6b012f1769d2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 05:16:11 crc kubenswrapper[4580]: E0321 05:16:11.437320 4580 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99c20bb2ab3dd9840212595c1bde9dc38fcb723ffbb4957bd9fd6b012f1769d2 is running failed: container process not found" containerID="99c20bb2ab3dd9840212595c1bde9dc38fcb723ffbb4957bd9fd6b012f1769d2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 05:16:11 crc kubenswrapper[4580]: E0321 05:16:11.437348 4580 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99c20bb2ab3dd9840212595c1bde9dc38fcb723ffbb4957bd9fd6b012f1769d2 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2ac92af5-c44d-488d-a784-f028b868fc24" containerName="nova-scheduler-scheduler" Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.631233 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d267a81-ba86-4ddd-b83b-37ae171c6230" path="/var/lib/kubelet/pods/0d267a81-ba86-4ddd-b83b-37ae171c6230/volumes" Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.632053 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c3ff363-eba3-4c3e-ae67-6064971743bb" path="/var/lib/kubelet/pods/4c3ff363-eba3-4c3e-ae67-6064971743bb/volumes" Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.693091 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.778519 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ac92af5-c44d-488d-a784-f028b868fc24-combined-ca-bundle\") pod \"2ac92af5-c44d-488d-a784-f028b868fc24\" (UID: \"2ac92af5-c44d-488d-a784-f028b868fc24\") " Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.778834 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v75kw\" (UniqueName: \"kubernetes.io/projected/2ac92af5-c44d-488d-a784-f028b868fc24-kube-api-access-v75kw\") pod \"2ac92af5-c44d-488d-a784-f028b868fc24\" (UID: \"2ac92af5-c44d-488d-a784-f028b868fc24\") " Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.778917 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ac92af5-c44d-488d-a784-f028b868fc24-config-data\") pod \"2ac92af5-c44d-488d-a784-f028b868fc24\" (UID: \"2ac92af5-c44d-488d-a784-f028b868fc24\") " Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.788526 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac92af5-c44d-488d-a784-f028b868fc24-kube-api-access-v75kw" (OuterVolumeSpecName: "kube-api-access-v75kw") pod "2ac92af5-c44d-488d-a784-f028b868fc24" (UID: "2ac92af5-c44d-488d-a784-f028b868fc24"). InnerVolumeSpecName "kube-api-access-v75kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.813732 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ac92af5-c44d-488d-a784-f028b868fc24-config-data" (OuterVolumeSpecName: "config-data") pod "2ac92af5-c44d-488d-a784-f028b868fc24" (UID: "2ac92af5-c44d-488d-a784-f028b868fc24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.813944 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ac92af5-c44d-488d-a784-f028b868fc24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ac92af5-c44d-488d-a784-f028b868fc24" (UID: "2ac92af5-c44d-488d-a784-f028b868fc24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.881178 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ac92af5-c44d-488d-a784-f028b868fc24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.881209 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v75kw\" (UniqueName: \"kubernetes.io/projected/2ac92af5-c44d-488d-a784-f028b868fc24-kube-api-access-v75kw\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:11 crc kubenswrapper[4580]: I0321 05:16:11.881223 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ac92af5-c44d-488d-a784-f028b868fc24-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.437839 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2ac92af5-c44d-488d-a784-f028b868fc24","Type":"ContainerDied","Data":"ea8ee33184cddaafb822f4c7404bea3dae7678033b54da84a868b7bcf1cd4393"} Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.438068 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.438091 4580 scope.go:117] "RemoveContainer" containerID="99c20bb2ab3dd9840212595c1bde9dc38fcb723ffbb4957bd9fd6b012f1769d2" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.459147 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"581fa0f6-632d-4054-b169-0aa596f21ee2","Type":"ContainerStarted","Data":"34f8d2685657be65d92451ad9b924288cba4bdb608f7819146a0cc4763321bdc"} Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.459217 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"581fa0f6-632d-4054-b169-0aa596f21ee2","Type":"ContainerStarted","Data":"9e0feabf215b5d2e86b8ff1fdce0317ad5f4dcd5a2a8e7db8ec640dadb754b9e"} Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.459251 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"581fa0f6-632d-4054-b169-0aa596f21ee2","Type":"ContainerStarted","Data":"77ffecf8de8a5a2b8b0b0410a2b95a91eacc0b72a2d23a0985408f233d20acd4"} Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.503774 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.5037461690000002 podStartE2EDuration="2.503746169s" podCreationTimestamp="2026-03-21 05:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:16:12.502561287 +0000 UTC m=+1477.585144925" watchObservedRunningTime="2026-03-21 05:16:12.503746169 +0000 UTC m=+1477.586329797" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.570069 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.586661 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-scheduler-0"] Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.601004 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:16:12 crc kubenswrapper[4580]: E0321 05:16:12.601685 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac92af5-c44d-488d-a784-f028b868fc24" containerName="nova-scheduler-scheduler" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.601709 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac92af5-c44d-488d-a784-f028b868fc24" containerName="nova-scheduler-scheduler" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.602203 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac92af5-c44d-488d-a784-f028b868fc24" containerName="nova-scheduler-scheduler" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.603504 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.615165 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.616819 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.736660 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-config-data\") pod \"nova-scheduler-0\" (UID: \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.737057 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.737611 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swnhm\" (UniqueName: \"kubernetes.io/projected/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-kube-api-access-swnhm\") pod \"nova-scheduler-0\" (UID: \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.771328 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-khbvh"] Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.773282 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.798541 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khbvh"] Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.840049 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swnhm\" (UniqueName: \"kubernetes.io/projected/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-kube-api-access-swnhm\") pod \"nova-scheduler-0\" (UID: \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.840159 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c46a62-0353-48cd-8aa9-d23f3fb2e000-catalog-content\") pod \"redhat-operators-khbvh\" (UID: \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\") " pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.840205 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-config-data\") pod \"nova-scheduler-0\" (UID: \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.840251 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c46a62-0353-48cd-8aa9-d23f3fb2e000-utilities\") pod \"redhat-operators-khbvh\" (UID: \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\") " pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.840278 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmlc9\" (UniqueName: \"kubernetes.io/projected/79c46a62-0353-48cd-8aa9-d23f3fb2e000-kube-api-access-nmlc9\") pod \"redhat-operators-khbvh\" (UID: \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\") " pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.840310 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.846376 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.857008 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-config-data\") pod \"nova-scheduler-0\" 
(UID: \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.857836 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swnhm\" (UniqueName: \"kubernetes.io/projected/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-kube-api-access-swnhm\") pod \"nova-scheduler-0\" (UID: \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.947493 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.948295 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c46a62-0353-48cd-8aa9-d23f3fb2e000-catalog-content\") pod \"redhat-operators-khbvh\" (UID: \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\") " pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.948390 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c46a62-0353-48cd-8aa9-d23f3fb2e000-utilities\") pod \"redhat-operators-khbvh\" (UID: \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\") " pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.948424 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmlc9\" (UniqueName: \"kubernetes.io/projected/79c46a62-0353-48cd-8aa9-d23f3fb2e000-kube-api-access-nmlc9\") pod \"redhat-operators-khbvh\" (UID: \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\") " pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.949312 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/79c46a62-0353-48cd-8aa9-d23f3fb2e000-catalog-content\") pod \"redhat-operators-khbvh\" (UID: \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\") " pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.949527 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c46a62-0353-48cd-8aa9-d23f3fb2e000-utilities\") pod \"redhat-operators-khbvh\" (UID: \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\") " pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:16:12 crc kubenswrapper[4580]: I0321 05:16:12.987231 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmlc9\" (UniqueName: \"kubernetes.io/projected/79c46a62-0353-48cd-8aa9-d23f3fb2e000-kube-api-access-nmlc9\") pod \"redhat-operators-khbvh\" (UID: \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\") " pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:16:13 crc kubenswrapper[4580]: I0321 05:16:13.088490 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:16:13 crc kubenswrapper[4580]: I0321 05:16:13.512087 4580 generic.go:334] "Generic (PLEG): container finished" podID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerID="f660370a5b85c0757c978411d4b13c5ed188b23f7b881d8e81f31c5eac41a537" exitCode=0 Mar 21 05:16:13 crc kubenswrapper[4580]: I0321 05:16:13.512109 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67655f8b6-mbx6n" event={"ID":"a03ce0fa-f7e8-4b48-bbea-95807f14dd26","Type":"ContainerDied","Data":"f660370a5b85c0757c978411d4b13c5ed188b23f7b881d8e81f31c5eac41a537"} Mar 21 05:16:13 crc kubenswrapper[4580]: I0321 05:16:13.512485 4580 scope.go:117] "RemoveContainer" containerID="b1910d7dc39d75c560d1ecb55908d0c4f510cbbee17323265da8706ab45dadba" Mar 21 05:16:13 crc kubenswrapper[4580]: I0321 05:16:13.636226 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac92af5-c44d-488d-a784-f028b868fc24" path="/var/lib/kubelet/pods/2ac92af5-c44d-488d-a784-f028b868fc24/volumes" Mar 21 05:16:13 crc kubenswrapper[4580]: I0321 05:16:13.636900 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:16:13 crc kubenswrapper[4580]: I0321 05:16:13.941147 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khbvh"] Mar 21 05:16:13 crc kubenswrapper[4580]: W0321 05:16:13.947741 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79c46a62_0353_48cd_8aa9_d23f3fb2e000.slice/crio-275587f7f0db2cefee8ec2c3643f7b8294518d126112109740542e6f98522cf7 WatchSource:0}: Error finding container 275587f7f0db2cefee8ec2c3643f7b8294518d126112109740542e6f98522cf7: Status 404 returned error can't find the container with id 275587f7f0db2cefee8ec2c3643f7b8294518d126112109740542e6f98522cf7 Mar 21 05:16:14 crc kubenswrapper[4580]: I0321 05:16:14.523425 4580 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c","Type":"ContainerStarted","Data":"7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb"} Mar 21 05:16:14 crc kubenswrapper[4580]: I0321 05:16:14.524532 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c","Type":"ContainerStarted","Data":"41e2e6829e17430fd7674d99fe9a0d7ef416ae9f83bc7708cfec948e68a31aeb"} Mar 21 05:16:14 crc kubenswrapper[4580]: I0321 05:16:14.527035 4580 generic.go:334] "Generic (PLEG): container finished" podID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerID="3584b7d891fb48aaccc82e7f27f929124e903ac471bf4157fb2950d6ceca0b5d" exitCode=0 Mar 21 05:16:14 crc kubenswrapper[4580]: I0321 05:16:14.528001 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khbvh" event={"ID":"79c46a62-0353-48cd-8aa9-d23f3fb2e000","Type":"ContainerDied","Data":"3584b7d891fb48aaccc82e7f27f929124e903ac471bf4157fb2950d6ceca0b5d"} Mar 21 05:16:14 crc kubenswrapper[4580]: I0321 05:16:14.528021 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khbvh" event={"ID":"79c46a62-0353-48cd-8aa9-d23f3fb2e000","Type":"ContainerStarted","Data":"275587f7f0db2cefee8ec2c3643f7b8294518d126112109740542e6f98522cf7"} Mar 21 05:16:14 crc kubenswrapper[4580]: I0321 05:16:14.535236 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67655f8b6-mbx6n" event={"ID":"a03ce0fa-f7e8-4b48-bbea-95807f14dd26","Type":"ContainerStarted","Data":"8dec5c044edc1705690a7eafd8a1c1f2fb3f54df8a14dee933c7e1786ce58f44"} Mar 21 05:16:14 crc kubenswrapper[4580]: I0321 05:16:14.560558 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.560533217 podStartE2EDuration="2.560533217s" 
podCreationTimestamp="2026-03-21 05:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:16:14.547848598 +0000 UTC m=+1479.630432226" watchObservedRunningTime="2026-03-21 05:16:14.560533217 +0000 UTC m=+1479.643116845" Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.563075 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khbvh" event={"ID":"79c46a62-0353-48cd-8aa9-d23f3fb2e000","Type":"ContainerStarted","Data":"3eaf33415ff84dc8582fb75f7b658a58f4831e77d9d7931f76adc6aba6dd3735"} Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.567992 4580 generic.go:334] "Generic (PLEG): container finished" podID="09860a85-c2e1-464a-9c6e-8361a90aa306" containerID="c6c7332aac4bc8652c8c19a5f6a8c6aa6f416ad5f15fa7dfc2cf51f177204cbe" exitCode=0 Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.568676 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"09860a85-c2e1-464a-9c6e-8361a90aa306","Type":"ContainerDied","Data":"c6c7332aac4bc8652c8c19a5f6a8c6aa6f416ad5f15fa7dfc2cf51f177204cbe"} Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.725468 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.837272 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7dw7\" (UniqueName: \"kubernetes.io/projected/09860a85-c2e1-464a-9c6e-8361a90aa306-kube-api-access-b7dw7\") pod \"09860a85-c2e1-464a-9c6e-8361a90aa306\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.837354 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09860a85-c2e1-464a-9c6e-8361a90aa306-config-data\") pod \"09860a85-c2e1-464a-9c6e-8361a90aa306\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.837453 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09860a85-c2e1-464a-9c6e-8361a90aa306-combined-ca-bundle\") pod \"09860a85-c2e1-464a-9c6e-8361a90aa306\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.837529 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09860a85-c2e1-464a-9c6e-8361a90aa306-logs\") pod \"09860a85-c2e1-464a-9c6e-8361a90aa306\" (UID: \"09860a85-c2e1-464a-9c6e-8361a90aa306\") " Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.837914 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09860a85-c2e1-464a-9c6e-8361a90aa306-logs" (OuterVolumeSpecName: "logs") pod "09860a85-c2e1-464a-9c6e-8361a90aa306" (UID: "09860a85-c2e1-464a-9c6e-8361a90aa306"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.838348 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09860a85-c2e1-464a-9c6e-8361a90aa306-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.849982 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09860a85-c2e1-464a-9c6e-8361a90aa306-kube-api-access-b7dw7" (OuterVolumeSpecName: "kube-api-access-b7dw7") pod "09860a85-c2e1-464a-9c6e-8361a90aa306" (UID: "09860a85-c2e1-464a-9c6e-8361a90aa306"). InnerVolumeSpecName "kube-api-access-b7dw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.869804 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09860a85-c2e1-464a-9c6e-8361a90aa306-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09860a85-c2e1-464a-9c6e-8361a90aa306" (UID: "09860a85-c2e1-464a-9c6e-8361a90aa306"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.879897 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09860a85-c2e1-464a-9c6e-8361a90aa306-config-data" (OuterVolumeSpecName: "config-data") pod "09860a85-c2e1-464a-9c6e-8361a90aa306" (UID: "09860a85-c2e1-464a-9c6e-8361a90aa306"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.939751 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7dw7\" (UniqueName: \"kubernetes.io/projected/09860a85-c2e1-464a-9c6e-8361a90aa306-kube-api-access-b7dw7\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.939809 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09860a85-c2e1-464a-9c6e-8361a90aa306-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:16 crc kubenswrapper[4580]: I0321 05:16:16.939819 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09860a85-c2e1-464a-9c6e-8361a90aa306-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.580003 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"09860a85-c2e1-464a-9c6e-8361a90aa306","Type":"ContainerDied","Data":"125be47a6270d51294791dbb5370a2266866b564c4a8543558320e0d18d2ee02"} Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.580327 4580 scope.go:117] "RemoveContainer" containerID="c6c7332aac4bc8652c8c19a5f6a8c6aa6f416ad5f15fa7dfc2cf51f177204cbe" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.580096 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.617772 4580 scope.go:117] "RemoveContainer" containerID="1065a064f685760b22908d8ea4a0ac2d7ca49180bc66c81305f1069451eba26f" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.629487 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.641971 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.681736 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 21 05:16:17 crc kubenswrapper[4580]: E0321 05:16:17.682124 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09860a85-c2e1-464a-9c6e-8361a90aa306" containerName="nova-api-api" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.682140 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="09860a85-c2e1-464a-9c6e-8361a90aa306" containerName="nova-api-api" Mar 21 05:16:17 crc kubenswrapper[4580]: E0321 05:16:17.682178 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09860a85-c2e1-464a-9c6e-8361a90aa306" containerName="nova-api-log" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.682185 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="09860a85-c2e1-464a-9c6e-8361a90aa306" containerName="nova-api-log" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.682342 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="09860a85-c2e1-464a-9c6e-8361a90aa306" containerName="nova-api-log" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.682369 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="09860a85-c2e1-464a-9c6e-8361a90aa306" containerName="nova-api-api" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.683240 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.685434 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.694949 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.760845 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99bb9ba-6aab-404f-9204-ede663d5478a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.760886 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a99bb9ba-6aab-404f-9204-ede663d5478a-logs\") pod \"nova-api-0\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.760999 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a99bb9ba-6aab-404f-9204-ede663d5478a-config-data\") pod \"nova-api-0\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.761022 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl4l7\" (UniqueName: \"kubernetes.io/projected/a99bb9ba-6aab-404f-9204-ede663d5478a-kube-api-access-tl4l7\") pod \"nova-api-0\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.862827 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a99bb9ba-6aab-404f-9204-ede663d5478a-config-data\") pod \"nova-api-0\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.862885 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl4l7\" (UniqueName: \"kubernetes.io/projected/a99bb9ba-6aab-404f-9204-ede663d5478a-kube-api-access-tl4l7\") pod \"nova-api-0\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.863001 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99bb9ba-6aab-404f-9204-ede663d5478a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.863034 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a99bb9ba-6aab-404f-9204-ede663d5478a-logs\") pod \"nova-api-0\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.863521 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a99bb9ba-6aab-404f-9204-ede663d5478a-logs\") pod \"nova-api-0\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.882470 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99bb9ba-6aab-404f-9204-ede663d5478a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.882775 
4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a99bb9ba-6aab-404f-9204-ede663d5478a-config-data\") pod \"nova-api-0\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.894394 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl4l7\" (UniqueName: \"kubernetes.io/projected/a99bb9ba-6aab-404f-9204-ede663d5478a-kube-api-access-tl4l7\") pod \"nova-api-0\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " pod="openstack/nova-api-0" Mar 21 05:16:17 crc kubenswrapper[4580]: I0321 05:16:17.948344 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 21 05:16:18 crc kubenswrapper[4580]: I0321 05:16:18.026936 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:16:18 crc kubenswrapper[4580]: I0321 05:16:18.504140 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:16:18 crc kubenswrapper[4580]: I0321 05:16:18.594350 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a99bb9ba-6aab-404f-9204-ede663d5478a","Type":"ContainerStarted","Data":"c4ea18cd7a39f00b1dcfd8d73390f77bcc3ae37ddedd7aa7c695b48888b72772"} Mar 21 05:16:19 crc kubenswrapper[4580]: I0321 05:16:19.607642 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a99bb9ba-6aab-404f-9204-ede663d5478a","Type":"ContainerStarted","Data":"ca159f031f571279af5c9bdf3de0c475feba0b4875c8d41399ee4c2de895b22f"} Mar 21 05:16:19 crc kubenswrapper[4580]: I0321 05:16:19.608242 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a99bb9ba-6aab-404f-9204-ede663d5478a","Type":"ContainerStarted","Data":"9d7d92edd71d04e4747bae1bda0613e5146b1d2fb4d9c756546c6e6d877ea8e1"} 
Mar 21 05:16:19 crc kubenswrapper[4580]: I0321 05:16:19.629434 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09860a85-c2e1-464a-9c6e-8361a90aa306" path="/var/lib/kubelet/pods/09860a85-c2e1-464a-9c6e-8361a90aa306/volumes" Mar 21 05:16:19 crc kubenswrapper[4580]: I0321 05:16:19.633870 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.633854092 podStartE2EDuration="2.633854092s" podCreationTimestamp="2026-03-21 05:16:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:16:19.626500166 +0000 UTC m=+1484.709083794" watchObservedRunningTime="2026-03-21 05:16:19.633854092 +0000 UTC m=+1484.716437720" Mar 21 05:16:20 crc kubenswrapper[4580]: I0321 05:16:20.873675 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 21 05:16:20 crc kubenswrapper[4580]: I0321 05:16:20.873737 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 21 05:16:21 crc kubenswrapper[4580]: I0321 05:16:21.507474 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:16:21 crc kubenswrapper[4580]: I0321 05:16:21.508240 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:16:21 crc kubenswrapper[4580]: I0321 05:16:21.632966 4580 generic.go:334] "Generic (PLEG): container finished" podID="08a0110f-428a-481d-b439-bc16e6837dc3" containerID="aabef473a7fedba8211603363e3d7574d3a24bbc5ab5b0fe74504ddddca72333" exitCode=137 Mar 21 05:16:21 crc kubenswrapper[4580]: I0321 05:16:21.633002 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587cfc8688-265kc" 
event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerDied","Data":"aabef473a7fedba8211603363e3d7574d3a24bbc5ab5b0fe74504ddddca72333"} Mar 21 05:16:21 crc kubenswrapper[4580]: I0321 05:16:21.633030 4580 scope.go:117] "RemoveContainer" containerID="f8ab4ef90bd31d20c6033eb943d9a0a9a88a0d10339df1ff4a08e1b1232fe783" Mar 21 05:16:21 crc kubenswrapper[4580]: I0321 05:16:21.887955 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="581fa0f6-632d-4054-b169-0aa596f21ee2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 05:16:21 crc kubenswrapper[4580]: I0321 05:16:21.887992 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="581fa0f6-632d-4054-b169-0aa596f21ee2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 05:16:22 crc kubenswrapper[4580]: I0321 05:16:22.644534 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587cfc8688-265kc" event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerStarted","Data":"1da18f5c3b92ba4606247cf9c331bf50366459fc361b2b8533adb37a81f66e54"} Mar 21 05:16:22 crc kubenswrapper[4580]: I0321 05:16:22.948142 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 21 05:16:22 crc kubenswrapper[4580]: I0321 05:16:22.979368 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 21 05:16:23 crc kubenswrapper[4580]: I0321 05:16:23.684654 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 21 05:16:26 crc kubenswrapper[4580]: I0321 05:16:26.680672 4580 generic.go:334] "Generic 
(PLEG): container finished" podID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerID="3eaf33415ff84dc8582fb75f7b658a58f4831e77d9d7931f76adc6aba6dd3735" exitCode=0 Mar 21 05:16:26 crc kubenswrapper[4580]: I0321 05:16:26.681938 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khbvh" event={"ID":"79c46a62-0353-48cd-8aa9-d23f3fb2e000","Type":"ContainerDied","Data":"3eaf33415ff84dc8582fb75f7b658a58f4831e77d9d7931f76adc6aba6dd3735"} Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.600050 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.692297 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-config-data\") pod \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\" (UID: \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\") " Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.692409 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-combined-ca-bundle\") pod \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\" (UID: \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\") " Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.692516 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz7p8\" (UniqueName: \"kubernetes.io/projected/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-kube-api-access-qz7p8\") pod \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\" (UID: \"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5\") " Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.699673 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khbvh" 
event={"ID":"79c46a62-0353-48cd-8aa9-d23f3fb2e000","Type":"ContainerStarted","Data":"5974694a009b237df8238ef61eb6abede38a0a3f7dd81f1f4c12e21a4d08c570"} Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.712754 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-kube-api-access-qz7p8" (OuterVolumeSpecName: "kube-api-access-qz7p8") pod "5c7c9415-46a1-40aa-96bc-4d3910f0c5c5" (UID: "5c7c9415-46a1-40aa-96bc-4d3910f0c5c5"). InnerVolumeSpecName "kube-api-access-qz7p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.714066 4580 generic.go:334] "Generic (PLEG): container finished" podID="5c7c9415-46a1-40aa-96bc-4d3910f0c5c5" containerID="6cce9a7d51a1669c0bc276a7b641ea973675dc3f163e477f746f96a06fb08257" exitCode=137 Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.714111 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5","Type":"ContainerDied","Data":"6cce9a7d51a1669c0bc276a7b641ea973675dc3f163e477f746f96a06fb08257"} Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.714143 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5c7c9415-46a1-40aa-96bc-4d3910f0c5c5","Type":"ContainerDied","Data":"41795fa327da864496db63d0c17450c756f67b6cdeb4ed1bb38ad0a94c8812d4"} Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.714160 4580 scope.go:117] "RemoveContainer" containerID="6cce9a7d51a1669c0bc276a7b641ea973675dc3f163e477f746f96a06fb08257" Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.714282 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.735723 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c7c9415-46a1-40aa-96bc-4d3910f0c5c5" (UID: "5c7c9415-46a1-40aa-96bc-4d3910f0c5c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.754047 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-khbvh" podStartSLOduration=3.107769632 podStartE2EDuration="15.75402246s" podCreationTimestamp="2026-03-21 05:16:12 +0000 UTC" firstStartedPulling="2026-03-21 05:16:14.52881775 +0000 UTC m=+1479.611401368" lastFinishedPulling="2026-03-21 05:16:27.175070568 +0000 UTC m=+1492.257654196" observedRunningTime="2026-03-21 05:16:27.728827973 +0000 UTC m=+1492.811411621" watchObservedRunningTime="2026-03-21 05:16:27.75402246 +0000 UTC m=+1492.836606078" Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.778405 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-config-data" (OuterVolumeSpecName: "config-data") pod "5c7c9415-46a1-40aa-96bc-4d3910f0c5c5" (UID: "5c7c9415-46a1-40aa-96bc-4d3910f0c5c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.780513 4580 scope.go:117] "RemoveContainer" containerID="6cce9a7d51a1669c0bc276a7b641ea973675dc3f163e477f746f96a06fb08257" Mar 21 05:16:27 crc kubenswrapper[4580]: E0321 05:16:27.782149 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cce9a7d51a1669c0bc276a7b641ea973675dc3f163e477f746f96a06fb08257\": container with ID starting with 6cce9a7d51a1669c0bc276a7b641ea973675dc3f163e477f746f96a06fb08257 not found: ID does not exist" containerID="6cce9a7d51a1669c0bc276a7b641ea973675dc3f163e477f746f96a06fb08257" Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.782197 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cce9a7d51a1669c0bc276a7b641ea973675dc3f163e477f746f96a06fb08257"} err="failed to get container status \"6cce9a7d51a1669c0bc276a7b641ea973675dc3f163e477f746f96a06fb08257\": rpc error: code = NotFound desc = could not find container \"6cce9a7d51a1669c0bc276a7b641ea973675dc3f163e477f746f96a06fb08257\": container with ID starting with 6cce9a7d51a1669c0bc276a7b641ea973675dc3f163e477f746f96a06fb08257 not found: ID does not exist" Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.794924 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz7p8\" (UniqueName: \"kubernetes.io/projected/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-kube-api-access-qz7p8\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.794982 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:27 crc kubenswrapper[4580]: I0321 05:16:27.794992 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.027555 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.028204 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.052223 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.073661 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.097887 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 05:16:28 crc kubenswrapper[4580]: E0321 05:16:28.098343 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7c9415-46a1-40aa-96bc-4d3910f0c5c5" containerName="nova-cell1-novncproxy-novncproxy" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.098361 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7c9415-46a1-40aa-96bc-4d3910f0c5c5" containerName="nova-cell1-novncproxy-novncproxy" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.098540 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7c9415-46a1-40aa-96bc-4d3910f0c5c5" containerName="nova-cell1-novncproxy-novncproxy" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.099680 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.103497 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.103757 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.103933 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.110252 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.202175 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7bd64a0-ec65-4f8c-841c-ca1950434439-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.202482 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q244\" (UniqueName: \"kubernetes.io/projected/b7bd64a0-ec65-4f8c-841c-ca1950434439-kube-api-access-2q244\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.202618 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7bd64a0-ec65-4f8c-841c-ca1950434439-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 
crc kubenswrapper[4580]: I0321 05:16:28.202916 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7bd64a0-ec65-4f8c-841c-ca1950434439-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.203025 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7bd64a0-ec65-4f8c-841c-ca1950434439-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.304817 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7bd64a0-ec65-4f8c-841c-ca1950434439-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.304886 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q244\" (UniqueName: \"kubernetes.io/projected/b7bd64a0-ec65-4f8c-841c-ca1950434439-kube-api-access-2q244\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.304940 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7bd64a0-ec65-4f8c-841c-ca1950434439-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 
05:16:28.305038 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7bd64a0-ec65-4f8c-841c-ca1950434439-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.305062 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7bd64a0-ec65-4f8c-841c-ca1950434439-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.313983 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7bd64a0-ec65-4f8c-841c-ca1950434439-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.314468 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7bd64a0-ec65-4f8c-841c-ca1950434439-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.314492 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7bd64a0-ec65-4f8c-841c-ca1950434439-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.318651 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7bd64a0-ec65-4f8c-841c-ca1950434439-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.353007 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q244\" (UniqueName: \"kubernetes.io/projected/b7bd64a0-ec65-4f8c-841c-ca1950434439-kube-api-access-2q244\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7bd64a0-ec65-4f8c-841c-ca1950434439\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.432101 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.869433 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.874255 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 21 05:16:28 crc kubenswrapper[4580]: I0321 05:16:28.875269 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 21 05:16:29 crc kubenswrapper[4580]: I0321 05:16:29.109987 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a99bb9ba-6aab-404f-9204-ede663d5478a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:16:29 crc kubenswrapper[4580]: I0321 05:16:29.110909 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a99bb9ba-6aab-404f-9204-ede663d5478a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Mar 21 05:16:29 crc kubenswrapper[4580]: I0321 05:16:29.654451 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c7c9415-46a1-40aa-96bc-4d3910f0c5c5" path="/var/lib/kubelet/pods/5c7c9415-46a1-40aa-96bc-4d3910f0c5c5/volumes" Mar 21 05:16:29 crc kubenswrapper[4580]: I0321 05:16:29.753263 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7bd64a0-ec65-4f8c-841c-ca1950434439","Type":"ContainerStarted","Data":"c843003b639e505df3d52aea1f619a2766c6b884c5fac1278f14178860a2b7bb"} Mar 21 05:16:29 crc kubenswrapper[4580]: I0321 05:16:29.753321 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7bd64a0-ec65-4f8c-841c-ca1950434439","Type":"ContainerStarted","Data":"9ebb4b15dba305be8be7206ed6184dade3716c43a9100c111e5f526899e18888"} Mar 21 05:16:29 crc kubenswrapper[4580]: I0321 05:16:29.775315 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.775298835 podStartE2EDuration="1.775298835s" podCreationTimestamp="2026-03-21 05:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:16:29.772524077 +0000 UTC m=+1494.855107715" watchObservedRunningTime="2026-03-21 05:16:29.775298835 +0000 UTC m=+1494.857882463" Mar 21 05:16:30 crc kubenswrapper[4580]: I0321 05:16:30.765721 4580 generic.go:334] "Generic (PLEG): container finished" podID="394de333-b465-45df-8251-bb4ae573b135" containerID="9d272dbb0e955717fe563a21e66e5f4aa48f6d4a9f8f73530e65ba4d4bc33129" exitCode=0 Mar 21 05:16:30 crc kubenswrapper[4580]: I0321 05:16:30.765822 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6xjtd" 
event={"ID":"394de333-b465-45df-8251-bb4ae573b135","Type":"ContainerDied","Data":"9d272dbb0e955717fe563a21e66e5f4aa48f6d4a9f8f73530e65ba4d4bc33129"} Mar 21 05:16:30 crc kubenswrapper[4580]: I0321 05:16:30.886485 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 21 05:16:30 crc kubenswrapper[4580]: I0321 05:16:30.892894 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 21 05:16:30 crc kubenswrapper[4580]: I0321 05:16:30.939405 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 21 05:16:31 crc kubenswrapper[4580]: I0321 05:16:31.393610 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:16:31 crc kubenswrapper[4580]: I0321 05:16:31.393661 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:16:31 crc kubenswrapper[4580]: I0321 05:16:31.395444 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 05:16:31 crc kubenswrapper[4580]: I0321 05:16:31.509118 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 21 05:16:31 crc kubenswrapper[4580]: I0321 05:16:31.785840 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 21 05:16:32 crc kubenswrapper[4580]: 
I0321 05:16:32.426899 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6xjtd" Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.501758 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-combined-ca-bundle\") pod \"394de333-b465-45df-8251-bb4ae573b135\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.502044 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-scripts\") pod \"394de333-b465-45df-8251-bb4ae573b135\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.502138 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts7jz\" (UniqueName: \"kubernetes.io/projected/394de333-b465-45df-8251-bb4ae573b135-kube-api-access-ts7jz\") pod \"394de333-b465-45df-8251-bb4ae573b135\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.502258 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-config-data\") pod \"394de333-b465-45df-8251-bb4ae573b135\" (UID: \"394de333-b465-45df-8251-bb4ae573b135\") " Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.540195 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394de333-b465-45df-8251-bb4ae573b135-kube-api-access-ts7jz" (OuterVolumeSpecName: "kube-api-access-ts7jz") pod "394de333-b465-45df-8251-bb4ae573b135" (UID: "394de333-b465-45df-8251-bb4ae573b135"). InnerVolumeSpecName "kube-api-access-ts7jz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.540626 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-scripts" (OuterVolumeSpecName: "scripts") pod "394de333-b465-45df-8251-bb4ae573b135" (UID: "394de333-b465-45df-8251-bb4ae573b135"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.546722 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "394de333-b465-45df-8251-bb4ae573b135" (UID: "394de333-b465-45df-8251-bb4ae573b135"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.549896 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-config-data" (OuterVolumeSpecName: "config-data") pod "394de333-b465-45df-8251-bb4ae573b135" (UID: "394de333-b465-45df-8251-bb4ae573b135"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.604405 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.604441 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.604453 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394de333-b465-45df-8251-bb4ae573b135-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.604461 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts7jz\" (UniqueName: \"kubernetes.io/projected/394de333-b465-45df-8251-bb4ae573b135-kube-api-access-ts7jz\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.785897 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6xjtd" event={"ID":"394de333-b465-45df-8251-bb4ae573b135","Type":"ContainerDied","Data":"573152bced2baf0e80a57c250c00727b0a4d795288824ba98b4330247554e32a"} Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.785935 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6xjtd" Mar 21 05:16:32 crc kubenswrapper[4580]: I0321 05:16:32.785955 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="573152bced2baf0e80a57c250c00727b0a4d795288824ba98b4330247554e32a" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.007874 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 21 05:16:33 crc kubenswrapper[4580]: E0321 05:16:33.008410 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394de333-b465-45df-8251-bb4ae573b135" containerName="nova-cell1-conductor-db-sync" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.008435 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="394de333-b465-45df-8251-bb4ae573b135" containerName="nova-cell1-conductor-db-sync" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.008700 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="394de333-b465-45df-8251-bb4ae573b135" containerName="nova-cell1-conductor-db-sync" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.009613 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.017761 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.022525 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.090001 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.090055 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.112501 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6qjv\" (UniqueName: \"kubernetes.io/projected/fcc0c177-5dea-46ab-9eb9-aa66a23d909f-kube-api-access-k6qjv\") pod \"nova-cell1-conductor-0\" (UID: \"fcc0c177-5dea-46ab-9eb9-aa66a23d909f\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.112638 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcc0c177-5dea-46ab-9eb9-aa66a23d909f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fcc0c177-5dea-46ab-9eb9-aa66a23d909f\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.112660 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc0c177-5dea-46ab-9eb9-aa66a23d909f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fcc0c177-5dea-46ab-9eb9-aa66a23d909f\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:16:33 crc 
kubenswrapper[4580]: I0321 05:16:33.215523 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcc0c177-5dea-46ab-9eb9-aa66a23d909f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fcc0c177-5dea-46ab-9eb9-aa66a23d909f\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.215568 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc0c177-5dea-46ab-9eb9-aa66a23d909f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fcc0c177-5dea-46ab-9eb9-aa66a23d909f\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.215666 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6qjv\" (UniqueName: \"kubernetes.io/projected/fcc0c177-5dea-46ab-9eb9-aa66a23d909f-kube-api-access-k6qjv\") pod \"nova-cell1-conductor-0\" (UID: \"fcc0c177-5dea-46ab-9eb9-aa66a23d909f\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.220601 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcc0c177-5dea-46ab-9eb9-aa66a23d909f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fcc0c177-5dea-46ab-9eb9-aa66a23d909f\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.221238 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc0c177-5dea-46ab-9eb9-aa66a23d909f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fcc0c177-5dea-46ab-9eb9-aa66a23d909f\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.258300 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k6qjv\" (UniqueName: \"kubernetes.io/projected/fcc0c177-5dea-46ab-9eb9-aa66a23d909f-kube-api-access-k6qjv\") pod \"nova-cell1-conductor-0\" (UID: \"fcc0c177-5dea-46ab-9eb9-aa66a23d909f\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.330220 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.433095 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:33 crc kubenswrapper[4580]: I0321 05:16:33.860731 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 21 05:16:34 crc kubenswrapper[4580]: I0321 05:16:34.140725 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-khbvh" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="registry-server" probeResult="failure" output=< Mar 21 05:16:34 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:16:34 crc kubenswrapper[4580]: > Mar 21 05:16:34 crc kubenswrapper[4580]: I0321 05:16:34.804325 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fcc0c177-5dea-46ab-9eb9-aa66a23d909f","Type":"ContainerStarted","Data":"0551c559908e74939118351e4a13db347228cecf831b3b0d2107096a36b24b60"} Mar 21 05:16:34 crc kubenswrapper[4580]: I0321 05:16:34.804370 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fcc0c177-5dea-46ab-9eb9-aa66a23d909f","Type":"ContainerStarted","Data":"52c4ba34e0753d081748aa0c1200b5fe53547089196bd28caac99e91f1010f7b"} Mar 21 05:16:34 crc kubenswrapper[4580]: I0321 05:16:34.804507 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 21 05:16:34 crc 
kubenswrapper[4580]: I0321 05:16:34.828216 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.828195062 podStartE2EDuration="2.828195062s" podCreationTimestamp="2026-03-21 05:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:16:34.821840334 +0000 UTC m=+1499.904423982" watchObservedRunningTime="2026-03-21 05:16:34.828195062 +0000 UTC m=+1499.910778710" Mar 21 05:16:36 crc kubenswrapper[4580]: I0321 05:16:36.027652 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 05:16:36 crc kubenswrapper[4580]: I0321 05:16:36.028473 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 05:16:38 crc kubenswrapper[4580]: I0321 05:16:38.032667 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 21 05:16:38 crc kubenswrapper[4580]: I0321 05:16:38.035642 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 21 05:16:38 crc kubenswrapper[4580]: I0321 05:16:38.036249 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 21 05:16:38 crc kubenswrapper[4580]: I0321 05:16:38.432377 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:38 crc kubenswrapper[4580]: I0321 05:16:38.471733 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:38 crc kubenswrapper[4580]: I0321 05:16:38.844747 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 21 05:16:38 crc kubenswrapper[4580]: I0321 05:16:38.879696 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.212019 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-bbsh9"] Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.217318 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.260274 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-bbsh9"] Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.348310 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-config\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.348358 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.348416 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf9fw\" (UniqueName: \"kubernetes.io/projected/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-kube-api-access-hf9fw\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.348525 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.348574 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.348677 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.450272 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-config\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.450349 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.450377 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf9fw\" (UniqueName: 
\"kubernetes.io/projected/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-kube-api-access-hf9fw\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.450461 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.450496 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.450539 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.451319 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-config\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.451485 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-dns-svc\") pod 
\"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.452294 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.452334 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.452710 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.477034 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf9fw\" (UniqueName: \"kubernetes.io/projected/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-kube-api-access-hf9fw\") pod \"dnsmasq-dns-89c5cd4d5-bbsh9\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:39 crc kubenswrapper[4580]: I0321 05:16:39.574668 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:40 crc kubenswrapper[4580]: I0321 05:16:40.286670 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-bbsh9"] Mar 21 05:16:40 crc kubenswrapper[4580]: I0321 05:16:40.868297 4580 generic.go:334] "Generic (PLEG): container finished" podID="9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" containerID="d644423c4850932b8a427715707ec77b6c8db397a6132a9cfb01435364ce07ce" exitCode=0 Mar 21 05:16:40 crc kubenswrapper[4580]: I0321 05:16:40.868349 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" event={"ID":"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2","Type":"ContainerDied","Data":"d644423c4850932b8a427715707ec77b6c8db397a6132a9cfb01435364ce07ce"} Mar 21 05:16:40 crc kubenswrapper[4580]: I0321 05:16:40.869058 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" event={"ID":"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2","Type":"ContainerStarted","Data":"68466636260c8e64b587c41d470194bd7daf9bdfd69f4c343522ba817fe53290"} Mar 21 05:16:41 crc kubenswrapper[4580]: I0321 05:16:41.393596 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 05:16:41 crc kubenswrapper[4580]: I0321 05:16:41.509578 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 21 05:16:41 crc kubenswrapper[4580]: I0321 05:16:41.879000 4580 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" event={"ID":"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2","Type":"ContainerStarted","Data":"71cebe7d6a47d8643893c0cb772804c24bb21b8b91d02e5c707b606715a4b9ec"} Mar 21 05:16:41 crc kubenswrapper[4580]: I0321 05:16:41.920837 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" podStartSLOduration=2.9208138420000003 podStartE2EDuration="2.920813842s" podCreationTimestamp="2026-03-21 05:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:16:41.907326184 +0000 UTC m=+1506.989909812" watchObservedRunningTime="2026-03-21 05:16:41.920813842 +0000 UTC m=+1507.003397470" Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.064884 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.065160 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a99bb9ba-6aab-404f-9204-ede663d5478a" containerName="nova-api-log" containerID="cri-o://9d7d92edd71d04e4747bae1bda0613e5146b1d2fb4d9c756546c6e6d877ea8e1" gracePeriod=30 Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.065298 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a99bb9ba-6aab-404f-9204-ede663d5478a" containerName="nova-api-api" containerID="cri-o://ca159f031f571279af5c9bdf3de0c475feba0b4875c8d41399ee4c2de895b22f" gracePeriod=30 Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.447011 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.447370 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" 
containerName="proxy-httpd" containerID="cri-o://5b0caee2f00fca700e7409e04cd8b8edfc633add411f4111e7ae6af73bbf1007" gracePeriod=30 Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.447447 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="ceilometer-notification-agent" containerID="cri-o://caa04f472ba51c65977ef6b911f99c99eaa39ad90fdc91393dec2d6cf5a43641" gracePeriod=30 Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.447471 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="sg-core" containerID="cri-o://e1f39073542de59327a521c9b5471a16d8975c3a39f2d713395243dfca36553f" gracePeriod=30 Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.447840 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="ceilometer-central-agent" containerID="cri-o://c8682db5f1111effe548a1609bfc5a39ca2f9fac8b81dab5af5edffa38454eef" gracePeriod=30 Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.896495 4580 generic.go:334] "Generic (PLEG): container finished" podID="e3333784-13bd-4bc3-b52b-97899001daaf" containerID="5b0caee2f00fca700e7409e04cd8b8edfc633add411f4111e7ae6af73bbf1007" exitCode=0 Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.896862 4580 generic.go:334] "Generic (PLEG): container finished" podID="e3333784-13bd-4bc3-b52b-97899001daaf" containerID="e1f39073542de59327a521c9b5471a16d8975c3a39f2d713395243dfca36553f" exitCode=2 Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.896913 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3333784-13bd-4bc3-b52b-97899001daaf","Type":"ContainerDied","Data":"5b0caee2f00fca700e7409e04cd8b8edfc633add411f4111e7ae6af73bbf1007"} Mar 
21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.896949 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3333784-13bd-4bc3-b52b-97899001daaf","Type":"ContainerDied","Data":"e1f39073542de59327a521c9b5471a16d8975c3a39f2d713395243dfca36553f"} Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.900284 4580 generic.go:334] "Generic (PLEG): container finished" podID="a99bb9ba-6aab-404f-9204-ede663d5478a" containerID="9d7d92edd71d04e4747bae1bda0613e5146b1d2fb4d9c756546c6e6d877ea8e1" exitCode=143 Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.900508 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a99bb9ba-6aab-404f-9204-ede663d5478a","Type":"ContainerDied","Data":"9d7d92edd71d04e4747bae1bda0613e5146b1d2fb4d9c756546c6e6d877ea8e1"} Mar 21 05:16:42 crc kubenswrapper[4580]: I0321 05:16:42.900625 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:43 crc kubenswrapper[4580]: I0321 05:16:43.378267 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 21 05:16:43 crc kubenswrapper[4580]: I0321 05:16:43.912942 4580 generic.go:334] "Generic (PLEG): container finished" podID="e3333784-13bd-4bc3-b52b-97899001daaf" containerID="c8682db5f1111effe548a1609bfc5a39ca2f9fac8b81dab5af5edffa38454eef" exitCode=0 Mar 21 05:16:43 crc kubenswrapper[4580]: I0321 05:16:43.914153 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3333784-13bd-4bc3-b52b-97899001daaf","Type":"ContainerDied","Data":"c8682db5f1111effe548a1609bfc5a39ca2f9fac8b81dab5af5edffa38454eef"} Mar 21 05:16:44 crc kubenswrapper[4580]: I0321 05:16:44.148140 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-khbvh" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="registry-server" 
probeResult="failure" output=< Mar 21 05:16:44 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:16:44 crc kubenswrapper[4580]: > Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.415446 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-4z846"] Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.417074 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.420643 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.420691 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.484683 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-config-data\") pod \"nova-cell1-cell-mapping-4z846\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.484725 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv8n5\" (UniqueName: \"kubernetes.io/projected/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-kube-api-access-hv8n5\") pod \"nova-cell1-cell-mapping-4z846\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.484857 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-scripts\") pod \"nova-cell1-cell-mapping-4z846\" (UID: 
\"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.484982 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4z846\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.485566 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4z846"] Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.587010 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-config-data\") pod \"nova-cell1-cell-mapping-4z846\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.587304 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv8n5\" (UniqueName: \"kubernetes.io/projected/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-kube-api-access-hv8n5\") pod \"nova-cell1-cell-mapping-4z846\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.587342 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-scripts\") pod \"nova-cell1-cell-mapping-4z846\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.588243 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4z846\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.611192 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4z846\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.612834 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-scripts\") pod \"nova-cell1-cell-mapping-4z846\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.614625 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv8n5\" (UniqueName: \"kubernetes.io/projected/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-kube-api-access-hv8n5\") pod \"nova-cell1-cell-mapping-4z846\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.615947 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-config-data\") pod \"nova-cell1-cell-mapping-4z846\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.731377 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.930389 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.960046 4580 generic.go:334] "Generic (PLEG): container finished" podID="a99bb9ba-6aab-404f-9204-ede663d5478a" containerID="ca159f031f571279af5c9bdf3de0c475feba0b4875c8d41399ee4c2de895b22f" exitCode=0 Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.960088 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a99bb9ba-6aab-404f-9204-ede663d5478a","Type":"ContainerDied","Data":"ca159f031f571279af5c9bdf3de0c475feba0b4875c8d41399ee4c2de895b22f"} Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.960114 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a99bb9ba-6aab-404f-9204-ede663d5478a","Type":"ContainerDied","Data":"c4ea18cd7a39f00b1dcfd8d73390f77bcc3ae37ddedd7aa7c695b48888b72772"} Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.960130 4580 scope.go:117] "RemoveContainer" containerID="ca159f031f571279af5c9bdf3de0c475feba0b4875c8d41399ee4c2de895b22f" Mar 21 05:16:45 crc kubenswrapper[4580]: I0321 05:16:45.960269 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.046925 4580 scope.go:117] "RemoveContainer" containerID="9d7d92edd71d04e4747bae1bda0613e5146b1d2fb4d9c756546c6e6d877ea8e1" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.095924 4580 scope.go:117] "RemoveContainer" containerID="ca159f031f571279af5c9bdf3de0c475feba0b4875c8d41399ee4c2de895b22f" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.099215 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a99bb9ba-6aab-404f-9204-ede663d5478a-config-data\") pod \"a99bb9ba-6aab-404f-9204-ede663d5478a\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.099265 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a99bb9ba-6aab-404f-9204-ede663d5478a-logs\") pod \"a99bb9ba-6aab-404f-9204-ede663d5478a\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " Mar 21 05:16:46 crc kubenswrapper[4580]: E0321 05:16:46.099357 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca159f031f571279af5c9bdf3de0c475feba0b4875c8d41399ee4c2de895b22f\": container with ID starting with ca159f031f571279af5c9bdf3de0c475feba0b4875c8d41399ee4c2de895b22f not found: ID does not exist" containerID="ca159f031f571279af5c9bdf3de0c475feba0b4875c8d41399ee4c2de895b22f" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.099384 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca159f031f571279af5c9bdf3de0c475feba0b4875c8d41399ee4c2de895b22f"} err="failed to get container status \"ca159f031f571279af5c9bdf3de0c475feba0b4875c8d41399ee4c2de895b22f\": rpc error: code = NotFound desc = could not find container 
\"ca159f031f571279af5c9bdf3de0c475feba0b4875c8d41399ee4c2de895b22f\": container with ID starting with ca159f031f571279af5c9bdf3de0c475feba0b4875c8d41399ee4c2de895b22f not found: ID does not exist" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.099403 4580 scope.go:117] "RemoveContainer" containerID="9d7d92edd71d04e4747bae1bda0613e5146b1d2fb4d9c756546c6e6d877ea8e1" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.099403 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99bb9ba-6aab-404f-9204-ede663d5478a-combined-ca-bundle\") pod \"a99bb9ba-6aab-404f-9204-ede663d5478a\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.099447 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl4l7\" (UniqueName: \"kubernetes.io/projected/a99bb9ba-6aab-404f-9204-ede663d5478a-kube-api-access-tl4l7\") pod \"a99bb9ba-6aab-404f-9204-ede663d5478a\" (UID: \"a99bb9ba-6aab-404f-9204-ede663d5478a\") " Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.099748 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a99bb9ba-6aab-404f-9204-ede663d5478a-logs" (OuterVolumeSpecName: "logs") pod "a99bb9ba-6aab-404f-9204-ede663d5478a" (UID: "a99bb9ba-6aab-404f-9204-ede663d5478a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.100166 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a99bb9ba-6aab-404f-9204-ede663d5478a-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:46 crc kubenswrapper[4580]: E0321 05:16:46.100457 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d7d92edd71d04e4747bae1bda0613e5146b1d2fb4d9c756546c6e6d877ea8e1\": container with ID starting with 9d7d92edd71d04e4747bae1bda0613e5146b1d2fb4d9c756546c6e6d877ea8e1 not found: ID does not exist" containerID="9d7d92edd71d04e4747bae1bda0613e5146b1d2fb4d9c756546c6e6d877ea8e1" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.101720 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7d92edd71d04e4747bae1bda0613e5146b1d2fb4d9c756546c6e6d877ea8e1"} err="failed to get container status \"9d7d92edd71d04e4747bae1bda0613e5146b1d2fb4d9c756546c6e6d877ea8e1\": rpc error: code = NotFound desc = could not find container \"9d7d92edd71d04e4747bae1bda0613e5146b1d2fb4d9c756546c6e6d877ea8e1\": container with ID starting with 9d7d92edd71d04e4747bae1bda0613e5146b1d2fb4d9c756546c6e6d877ea8e1 not found: ID does not exist" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.111049 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a99bb9ba-6aab-404f-9204-ede663d5478a-kube-api-access-tl4l7" (OuterVolumeSpecName: "kube-api-access-tl4l7") pod "a99bb9ba-6aab-404f-9204-ede663d5478a" (UID: "a99bb9ba-6aab-404f-9204-ede663d5478a"). InnerVolumeSpecName "kube-api-access-tl4l7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.165927 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a99bb9ba-6aab-404f-9204-ede663d5478a-config-data" (OuterVolumeSpecName: "config-data") pod "a99bb9ba-6aab-404f-9204-ede663d5478a" (UID: "a99bb9ba-6aab-404f-9204-ede663d5478a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.171875 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a99bb9ba-6aab-404f-9204-ede663d5478a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a99bb9ba-6aab-404f-9204-ede663d5478a" (UID: "a99bb9ba-6aab-404f-9204-ede663d5478a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.202753 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a99bb9ba-6aab-404f-9204-ede663d5478a-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.203091 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99bb9ba-6aab-404f-9204-ede663d5478a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.203132 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl4l7\" (UniqueName: \"kubernetes.io/projected/a99bb9ba-6aab-404f-9204-ede663d5478a-kube-api-access-tl4l7\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.359434 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.369961 4580 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-0"] Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.389685 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 21 05:16:46 crc kubenswrapper[4580]: E0321 05:16:46.390207 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99bb9ba-6aab-404f-9204-ede663d5478a" containerName="nova-api-api" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.390226 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99bb9ba-6aab-404f-9204-ede663d5478a" containerName="nova-api-api" Mar 21 05:16:46 crc kubenswrapper[4580]: E0321 05:16:46.390252 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99bb9ba-6aab-404f-9204-ede663d5478a" containerName="nova-api-log" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.390260 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99bb9ba-6aab-404f-9204-ede663d5478a" containerName="nova-api-log" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.390508 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="a99bb9ba-6aab-404f-9204-ede663d5478a" containerName="nova-api-log" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.390542 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="a99bb9ba-6aab-404f-9204-ede663d5478a" containerName="nova-api-api" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.391938 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.397233 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.397402 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.400956 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.442101 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4z846"] Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.465220 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.509062 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-config-data\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.509214 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-public-tls-certs\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.509257 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kbjm\" (UniqueName: \"kubernetes.io/projected/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-kube-api-access-5kbjm\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 
crc kubenswrapper[4580]: I0321 05:16:46.509303 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-logs\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.509410 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.509432 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.611437 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-public-tls-certs\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.611489 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kbjm\" (UniqueName: \"kubernetes.io/projected/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-kube-api-access-5kbjm\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.611524 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-logs\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.611580 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.611599 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.611668 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-config-data\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.612752 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-logs\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.623447 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-config-data\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.631336 4580 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.631370 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.631569 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kbjm\" (UniqueName: \"kubernetes.io/projected/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-kube-api-access-5kbjm\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.632131 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-public-tls-certs\") pod \"nova-api-0\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.909565 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.987983 4580 generic.go:334] "Generic (PLEG): container finished" podID="e3333784-13bd-4bc3-b52b-97899001daaf" containerID="caa04f472ba51c65977ef6b911f99c99eaa39ad90fdc91393dec2d6cf5a43641" exitCode=0 Mar 21 05:16:46 crc kubenswrapper[4580]: I0321 05:16:46.988042 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3333784-13bd-4bc3-b52b-97899001daaf","Type":"ContainerDied","Data":"caa04f472ba51c65977ef6b911f99c99eaa39ad90fdc91393dec2d6cf5a43641"} Mar 21 05:16:47 crc kubenswrapper[4580]: I0321 05:16:47.011319 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4z846" event={"ID":"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd","Type":"ContainerStarted","Data":"64d7ac80d9e70fa5f18bc4e5a09f88bf68eb7b9a6790c1dd7f0cbc5d458b689d"} Mar 21 05:16:47 crc kubenswrapper[4580]: I0321 05:16:47.011370 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4z846" event={"ID":"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd","Type":"ContainerStarted","Data":"75e98d7aabb24719a74b04c2742b7a68b4fc8133abfe329349e558e35c6f03be"} Mar 21 05:16:47 crc kubenswrapper[4580]: I0321 05:16:47.039626 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-4z846" podStartSLOduration=2.039607118 podStartE2EDuration="2.039607118s" podCreationTimestamp="2026-03-21 05:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:16:47.037337064 +0000 UTC m=+1512.119920692" watchObservedRunningTime="2026-03-21 05:16:47.039607118 +0000 UTC m=+1512.122190746" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.265499 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.432401 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3333784-13bd-4bc3-b52b-97899001daaf-log-httpd\") pod \"e3333784-13bd-4bc3-b52b-97899001daaf\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.432535 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-sg-core-conf-yaml\") pod \"e3333784-13bd-4bc3-b52b-97899001daaf\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.432636 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-scripts\") pod \"e3333784-13bd-4bc3-b52b-97899001daaf\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.432682 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-config-data\") pod \"e3333784-13bd-4bc3-b52b-97899001daaf\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.432709 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2gt8\" (UniqueName: \"kubernetes.io/projected/e3333784-13bd-4bc3-b52b-97899001daaf-kube-api-access-m2gt8\") pod \"e3333784-13bd-4bc3-b52b-97899001daaf\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.432849 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e3333784-13bd-4bc3-b52b-97899001daaf-run-httpd\") pod \"e3333784-13bd-4bc3-b52b-97899001daaf\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.432925 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-combined-ca-bundle\") pod \"e3333784-13bd-4bc3-b52b-97899001daaf\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.432970 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-ceilometer-tls-certs\") pod \"e3333784-13bd-4bc3-b52b-97899001daaf\" (UID: \"e3333784-13bd-4bc3-b52b-97899001daaf\") " Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.433090 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3333784-13bd-4bc3-b52b-97899001daaf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e3333784-13bd-4bc3-b52b-97899001daaf" (UID: "e3333784-13bd-4bc3-b52b-97899001daaf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.433303 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3333784-13bd-4bc3-b52b-97899001daaf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e3333784-13bd-4bc3-b52b-97899001daaf" (UID: "e3333784-13bd-4bc3-b52b-97899001daaf"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.433715 4580 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3333784-13bd-4bc3-b52b-97899001daaf-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.433732 4580 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3333784-13bd-4bc3-b52b-97899001daaf-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.442498 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-scripts" (OuterVolumeSpecName: "scripts") pod "e3333784-13bd-4bc3-b52b-97899001daaf" (UID: "e3333784-13bd-4bc3-b52b-97899001daaf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.457940 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3333784-13bd-4bc3-b52b-97899001daaf-kube-api-access-m2gt8" (OuterVolumeSpecName: "kube-api-access-m2gt8") pod "e3333784-13bd-4bc3-b52b-97899001daaf" (UID: "e3333784-13bd-4bc3-b52b-97899001daaf"). InnerVolumeSpecName "kube-api-access-m2gt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.498800 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e3333784-13bd-4bc3-b52b-97899001daaf" (UID: "e3333784-13bd-4bc3-b52b-97899001daaf"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.516403 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e3333784-13bd-4bc3-b52b-97899001daaf" (UID: "e3333784-13bd-4bc3-b52b-97899001daaf"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.535646 4580 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.535675 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.535684 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2gt8\" (UniqueName: \"kubernetes.io/projected/e3333784-13bd-4bc3-b52b-97899001daaf-kube-api-access-m2gt8\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.535693 4580 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.587948 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3333784-13bd-4bc3-b52b-97899001daaf" (UID: "e3333784-13bd-4bc3-b52b-97899001daaf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.606501 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-config-data" (OuterVolumeSpecName: "config-data") pod "e3333784-13bd-4bc3-b52b-97899001daaf" (UID: "e3333784-13bd-4bc3-b52b-97899001daaf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.628178 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a99bb9ba-6aab-404f-9204-ede663d5478a" path="/var/lib/kubelet/pods/a99bb9ba-6aab-404f-9204-ede663d5478a/volumes" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.638632 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:47.638666 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3333784-13bd-4bc3-b52b-97899001daaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.024102 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3333784-13bd-4bc3-b52b-97899001daaf","Type":"ContainerDied","Data":"ff23de3c89053ee9ed2548541eef5a62f4ff6d1cb1e54dbd1749c4af8362766f"} Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.024416 4580 scope.go:117] "RemoveContainer" containerID="5b0caee2f00fca700e7409e04cd8b8edfc633add411f4111e7ae6af73bbf1007" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.024165 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.056484 4580 scope.go:117] "RemoveContainer" containerID="e1f39073542de59327a521c9b5471a16d8975c3a39f2d713395243dfca36553f" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.056565 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.068384 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.116015 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:16:48 crc kubenswrapper[4580]: E0321 05:16:48.116547 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="ceilometer-central-agent" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.116562 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="ceilometer-central-agent" Mar 21 05:16:48 crc kubenswrapper[4580]: E0321 05:16:48.116600 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="ceilometer-notification-agent" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.116608 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="ceilometer-notification-agent" Mar 21 05:16:48 crc kubenswrapper[4580]: E0321 05:16:48.116621 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="sg-core" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.116629 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="sg-core" Mar 21 05:16:48 crc kubenswrapper[4580]: E0321 05:16:48.116642 4580 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="proxy-httpd" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.116648 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="proxy-httpd" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.116881 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="sg-core" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.116893 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="ceilometer-central-agent" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.116919 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="proxy-httpd" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.116930 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" containerName="ceilometer-notification-agent" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.118793 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.123396 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.123891 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.126530 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.146059 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.147714 4580 scope.go:117] "RemoveContainer" containerID="caa04f472ba51c65977ef6b911f99c99eaa39ad90fdc91393dec2d6cf5a43641" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.237097 4580 scope.go:117] "RemoveContainer" containerID="c8682db5f1111effe548a1609bfc5a39ca2f9fac8b81dab5af5edffa38454eef" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.255520 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-scripts\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.255590 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-run-httpd\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.255611 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-config-data\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.255664 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp7rt\" (UniqueName: \"kubernetes.io/projected/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-kube-api-access-jp7rt\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.255701 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.255752 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.255823 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-log-httpd\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.255853 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.359973 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.360127 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-scripts\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.360193 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-run-httpd\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.360213 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-config-data\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.360275 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp7rt\" (UniqueName: \"kubernetes.io/projected/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-kube-api-access-jp7rt\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.360313 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.361293 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.361594 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-run-httpd\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.361708 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-log-httpd\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.362010 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-log-httpd\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.369033 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: 
I0321 05:16:48.372274 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-config-data\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.373983 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-scripts\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.385351 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.391336 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp7rt\" (UniqueName: \"kubernetes.io/projected/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-kube-api-access-jp7rt\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.397917 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c77c9b9f-3e73-4cef-9e10-39bfef8357b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c77c9b9f-3e73-4cef-9e10-39bfef8357b5\") " pod="openstack/ceilometer-0" Mar 21 05:16:48 crc kubenswrapper[4580]: I0321 05:16:48.437861 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:16:49 crc kubenswrapper[4580]: I0321 05:16:49.100324 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:16:49 crc kubenswrapper[4580]: I0321 05:16:49.200352 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:16:49 crc kubenswrapper[4580]: W0321 05:16:49.221835 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77c9b9f_3e73_4cef_9e10_39bfef8357b5.slice/crio-496d17e9cfd6a4d45a3c68170fdc487277d5f9867383e12d7712291654b66f62 WatchSource:0}: Error finding container 496d17e9cfd6a4d45a3c68170fdc487277d5f9867383e12d7712291654b66f62: Status 404 returned error can't find the container with id 496d17e9cfd6a4d45a3c68170fdc487277d5f9867383e12d7712291654b66f62 Mar 21 05:16:49 crc kubenswrapper[4580]: I0321 05:16:49.576950 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:16:49 crc kubenswrapper[4580]: I0321 05:16:49.633832 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3333784-13bd-4bc3-b52b-97899001daaf" path="/var/lib/kubelet/pods/e3333784-13bd-4bc3-b52b-97899001daaf/volumes" Mar 21 05:16:49 crc kubenswrapper[4580]: I0321 05:16:49.733029 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-z5g44"] Mar 21 05:16:49 crc kubenswrapper[4580]: I0321 05:16:49.733643 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-z5g44" podUID="e5fbadc4-b849-4ff9-b723-acc959e19b70" containerName="dnsmasq-dns" containerID="cri-o://53a50eb9b4500ab1c51da206c75fa2e7ac479a687c959f3fd455e65baaa2a9a8" gracePeriod=10 Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.067118 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ce2547ac-9d8b-4542-b5d0-c76fd27e3442","Type":"ContainerStarted","Data":"b6705a71302f93ecd1668fdc3a12e92e360e7958941981cb86cbe33fd5aa54c3"} Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.067176 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce2547ac-9d8b-4542-b5d0-c76fd27e3442","Type":"ContainerStarted","Data":"731d3dc76040af2cf0e22089c23c8414fa451c0a6cc354c1c3fcb83f8aaed9cf"} Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.067189 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce2547ac-9d8b-4542-b5d0-c76fd27e3442","Type":"ContainerStarted","Data":"e2b719f78af352626dc2c6568f14ee45e9a2a9bafffbf859870172f53ae19797"} Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.071213 4580 generic.go:334] "Generic (PLEG): container finished" podID="e5fbadc4-b849-4ff9-b723-acc959e19b70" containerID="53a50eb9b4500ab1c51da206c75fa2e7ac479a687c959f3fd455e65baaa2a9a8" exitCode=0 Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.071263 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-z5g44" event={"ID":"e5fbadc4-b849-4ff9-b723-acc959e19b70","Type":"ContainerDied","Data":"53a50eb9b4500ab1c51da206c75fa2e7ac479a687c959f3fd455e65baaa2a9a8"} Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.072895 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c77c9b9f-3e73-4cef-9e10-39bfef8357b5","Type":"ContainerStarted","Data":"496d17e9cfd6a4d45a3c68170fdc487277d5f9867383e12d7712291654b66f62"} Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.097908 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.097882576 podStartE2EDuration="4.097882576s" podCreationTimestamp="2026-03-21 05:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-21 05:16:50.09303315 +0000 UTC m=+1515.175616788" watchObservedRunningTime="2026-03-21 05:16:50.097882576 +0000 UTC m=+1515.180466204" Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.319897 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.444889 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-config\") pod \"e5fbadc4-b849-4ff9-b723-acc959e19b70\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.444954 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-dns-svc\") pod \"e5fbadc4-b849-4ff9-b723-acc959e19b70\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.445028 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-ovsdbserver-nb\") pod \"e5fbadc4-b849-4ff9-b723-acc959e19b70\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.445198 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhb4t\" (UniqueName: \"kubernetes.io/projected/e5fbadc4-b849-4ff9-b723-acc959e19b70-kube-api-access-zhb4t\") pod \"e5fbadc4-b849-4ff9-b723-acc959e19b70\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.445287 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-ovsdbserver-sb\") pod \"e5fbadc4-b849-4ff9-b723-acc959e19b70\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.445313 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-dns-swift-storage-0\") pod \"e5fbadc4-b849-4ff9-b723-acc959e19b70\" (UID: \"e5fbadc4-b849-4ff9-b723-acc959e19b70\") " Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.459287 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5fbadc4-b849-4ff9-b723-acc959e19b70-kube-api-access-zhb4t" (OuterVolumeSpecName: "kube-api-access-zhb4t") pod "e5fbadc4-b849-4ff9-b723-acc959e19b70" (UID: "e5fbadc4-b849-4ff9-b723-acc959e19b70"). InnerVolumeSpecName "kube-api-access-zhb4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.545467 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e5fbadc4-b849-4ff9-b723-acc959e19b70" (UID: "e5fbadc4-b849-4ff9-b723-acc959e19b70"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.548289 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.548315 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhb4t\" (UniqueName: \"kubernetes.io/projected/e5fbadc4-b849-4ff9-b723-acc959e19b70-kube-api-access-zhb4t\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.554051 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e5fbadc4-b849-4ff9-b723-acc959e19b70" (UID: "e5fbadc4-b849-4ff9-b723-acc959e19b70"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.562643 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5fbadc4-b849-4ff9-b723-acc959e19b70" (UID: "e5fbadc4-b849-4ff9-b723-acc959e19b70"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.577219 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-config" (OuterVolumeSpecName: "config") pod "e5fbadc4-b849-4ff9-b723-acc959e19b70" (UID: "e5fbadc4-b849-4ff9-b723-acc959e19b70"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.578371 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e5fbadc4-b849-4ff9-b723-acc959e19b70" (UID: "e5fbadc4-b849-4ff9-b723-acc959e19b70"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.650102 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.650226 4580 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.650241 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:50 crc kubenswrapper[4580]: I0321 05:16:50.650252 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5fbadc4-b849-4ff9-b723-acc959e19b70-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.083861 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c77c9b9f-3e73-4cef-9e10-39bfef8357b5","Type":"ContainerStarted","Data":"e0d7b034064341f4b12cd2967d78e65968c48b360ae1faa41ae1e583de072c36"} Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.085758 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-757b4f8459-z5g44" event={"ID":"e5fbadc4-b849-4ff9-b723-acc959e19b70","Type":"ContainerDied","Data":"9e844b45389fd9d9d67a850b8efec6b839cfa275bd1327ae11655ecaa94a000f"} Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.085814 4580 scope.go:117] "RemoveContainer" containerID="53a50eb9b4500ab1c51da206c75fa2e7ac479a687c959f3fd455e65baaa2a9a8" Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.085879 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-z5g44" Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.121111 4580 scope.go:117] "RemoveContainer" containerID="205210abd5ae6b11199c300f7f528980e6136defb5c9afe2874a96e3b1a788a6" Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.128754 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-z5g44"] Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.139290 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-z5g44"] Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.395543 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.395625 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.396425 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"1da18f5c3b92ba4606247cf9c331bf50366459fc361b2b8533adb37a81f66e54"} pod="openstack/horizon-587cfc8688-265kc" containerMessage="Container horizon failed startup probe, 
will be restarted" Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.396459 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" containerID="cri-o://1da18f5c3b92ba4606247cf9c331bf50366459fc361b2b8533adb37a81f66e54" gracePeriod=30 Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.507286 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.507370 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.508263 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"8dec5c044edc1705690a7eafd8a1c1f2fb3f54df8a14dee933c7e1786ce58f44"} pod="openstack/horizon-67655f8b6-mbx6n" containerMessage="Container horizon failed startup probe, will be restarted" Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.508303 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" containerID="cri-o://8dec5c044edc1705690a7eafd8a1c1f2fb3f54df8a14dee933c7e1786ce58f44" gracePeriod=30 Mar 21 05:16:51 crc kubenswrapper[4580]: I0321 05:16:51.630896 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5fbadc4-b849-4ff9-b723-acc959e19b70" path="/var/lib/kubelet/pods/e5fbadc4-b849-4ff9-b723-acc959e19b70/volumes" Mar 21 05:16:53 crc kubenswrapper[4580]: I0321 05:16:53.112618 4580 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c77c9b9f-3e73-4cef-9e10-39bfef8357b5","Type":"ContainerStarted","Data":"9a26a60ab6f683f687fcab4f05dadd099ee5b4a9d2bd51675dea0bbbc62c57dc"} Mar 21 05:16:53 crc kubenswrapper[4580]: I0321 05:16:53.113351 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c77c9b9f-3e73-4cef-9e10-39bfef8357b5","Type":"ContainerStarted","Data":"338fe955bccd7500e0f27b2205b752783d281f7d23ca023718b8cc62ba518495"} Mar 21 05:16:54 crc kubenswrapper[4580]: I0321 05:16:54.145207 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-khbvh" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="registry-server" probeResult="failure" output=< Mar 21 05:16:54 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:16:54 crc kubenswrapper[4580]: > Mar 21 05:16:55 crc kubenswrapper[4580]: I0321 05:16:55.132214 4580 generic.go:334] "Generic (PLEG): container finished" podID="0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd" containerID="64d7ac80d9e70fa5f18bc4e5a09f88bf68eb7b9a6790c1dd7f0cbc5d458b689d" exitCode=0 Mar 21 05:16:55 crc kubenswrapper[4580]: I0321 05:16:55.132560 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4z846" event={"ID":"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd","Type":"ContainerDied","Data":"64d7ac80d9e70fa5f18bc4e5a09f88bf68eb7b9a6790c1dd7f0cbc5d458b689d"} Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.148102 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c77c9b9f-3e73-4cef-9e10-39bfef8357b5","Type":"ContainerStarted","Data":"72845c6c335fc57bc1f6d2dcbb70ad42893437fc28b2a8fc2f72a64c182e587b"} Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.176602 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.395510966 
podStartE2EDuration="8.176580841s" podCreationTimestamp="2026-03-21 05:16:48 +0000 UTC" firstStartedPulling="2026-03-21 05:16:49.231232142 +0000 UTC m=+1514.313815770" lastFinishedPulling="2026-03-21 05:16:55.012302017 +0000 UTC m=+1520.094885645" observedRunningTime="2026-03-21 05:16:56.170721506 +0000 UTC m=+1521.253305144" watchObservedRunningTime="2026-03-21 05:16:56.176580841 +0000 UTC m=+1521.259164469" Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.485613 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.572486 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-scripts\") pod \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.572593 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-config-data\") pod \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.572629 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-combined-ca-bundle\") pod \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\" (UID: \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.572664 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv8n5\" (UniqueName: \"kubernetes.io/projected/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-kube-api-access-hv8n5\") pod \"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\" (UID: 
\"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd\") " Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.578898 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-scripts" (OuterVolumeSpecName: "scripts") pod "0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd" (UID: "0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.581986 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-kube-api-access-hv8n5" (OuterVolumeSpecName: "kube-api-access-hv8n5") pod "0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd" (UID: "0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd"). InnerVolumeSpecName "kube-api-access-hv8n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.607021 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd" (UID: "0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.612082 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-config-data" (OuterVolumeSpecName: "config-data") pod "0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd" (UID: "0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.675908 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.675946 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.675962 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.675980 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv8n5\" (UniqueName: \"kubernetes.io/projected/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd-kube-api-access-hv8n5\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.909803 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 05:16:56 crc kubenswrapper[4580]: I0321 05:16:56.909852 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.159491 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4z846" Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.163899 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4z846" event={"ID":"0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd","Type":"ContainerDied","Data":"75e98d7aabb24719a74b04c2742b7a68b4fc8133abfe329349e558e35c6f03be"} Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.163966 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75e98d7aabb24719a74b04c2742b7a68b4fc8133abfe329349e558e35c6f03be" Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.163993 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.326437 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.326668 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ce2547ac-9d8b-4542-b5d0-c76fd27e3442" containerName="nova-api-log" containerID="cri-o://731d3dc76040af2cf0e22089c23c8414fa451c0a6cc354c1c3fcb83f8aaed9cf" gracePeriod=30 Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.326733 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ce2547ac-9d8b-4542-b5d0-c76fd27e3442" containerName="nova-api-api" containerID="cri-o://b6705a71302f93ecd1668fdc3a12e92e360e7958941981cb86cbe33fd5aa54c3" gracePeriod=30 Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.336426 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ce2547ac-9d8b-4542-b5d0-c76fd27e3442" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": EOF" Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.355606 4580 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="ce2547ac-9d8b-4542-b5d0-c76fd27e3442" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": EOF" Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.360938 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.361150 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c" containerName="nova-scheduler-scheduler" containerID="cri-o://7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb" gracePeriod=30 Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.386106 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.386335 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="581fa0f6-632d-4054-b169-0aa596f21ee2" containerName="nova-metadata-log" containerID="cri-o://9e0feabf215b5d2e86b8ff1fdce0317ad5f4dcd5a2a8e7db8ec640dadb754b9e" gracePeriod=30 Mar 21 05:16:57 crc kubenswrapper[4580]: I0321 05:16:57.386429 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="581fa0f6-632d-4054-b169-0aa596f21ee2" containerName="nova-metadata-metadata" containerID="cri-o://34f8d2685657be65d92451ad9b924288cba4bdb608f7819146a0cc4763321bdc" gracePeriod=30 Mar 21 05:16:57 crc kubenswrapper[4580]: E0321 05:16:57.951151 4580 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 05:16:57 crc kubenswrapper[4580]: 
E0321 05:16:57.952837 4580 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 05:16:57 crc kubenswrapper[4580]: E0321 05:16:57.954389 4580 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 05:16:57 crc kubenswrapper[4580]: E0321 05:16:57.954444 4580 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c" containerName="nova-scheduler-scheduler" Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 05:16:58.170049 4580 generic.go:334] "Generic (PLEG): container finished" podID="581fa0f6-632d-4054-b169-0aa596f21ee2" containerID="9e0feabf215b5d2e86b8ff1fdce0317ad5f4dcd5a2a8e7db8ec640dadb754b9e" exitCode=143 Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 05:16:58.170126 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"581fa0f6-632d-4054-b169-0aa596f21ee2","Type":"ContainerDied","Data":"9e0feabf215b5d2e86b8ff1fdce0317ad5f4dcd5a2a8e7db8ec640dadb754b9e"} Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 05:16:58.171640 4580 generic.go:334] "Generic (PLEG): container finished" podID="ce2547ac-9d8b-4542-b5d0-c76fd27e3442" containerID="731d3dc76040af2cf0e22089c23c8414fa451c0a6cc354c1c3fcb83f8aaed9cf" exitCode=143 Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 
05:16:58.172596 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce2547ac-9d8b-4542-b5d0-c76fd27e3442","Type":"ContainerDied","Data":"731d3dc76040af2cf0e22089c23c8414fa451c0a6cc354c1c3fcb83f8aaed9cf"} Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 05:16:58.625614 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 05:16:58.821283 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-config-data\") pod \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\" (UID: \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\") " Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 05:16:58.821429 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swnhm\" (UniqueName: \"kubernetes.io/projected/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-kube-api-access-swnhm\") pod \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\" (UID: \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\") " Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 05:16:58.821473 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-combined-ca-bundle\") pod \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\" (UID: \"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c\") " Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 05:16:58.830043 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-kube-api-access-swnhm" (OuterVolumeSpecName: "kube-api-access-swnhm") pod "2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c" (UID: "2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c"). InnerVolumeSpecName "kube-api-access-swnhm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 05:16:58.872042 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-config-data" (OuterVolumeSpecName: "config-data") pod "2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c" (UID: "2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 05:16:58.872511 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c" (UID: "2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 05:16:58.923325 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swnhm\" (UniqueName: \"kubernetes.io/projected/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-kube-api-access-swnhm\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 05:16:58.923585 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:58 crc kubenswrapper[4580]: I0321 05:16:58.923666 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.187129 4580 generic.go:334] "Generic (PLEG): container finished" podID="2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c" containerID="7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb" 
exitCode=0 Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.187176 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c","Type":"ContainerDied","Data":"7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb"} Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.187204 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c","Type":"ContainerDied","Data":"41e2e6829e17430fd7674d99fe9a0d7ef416ae9f83bc7708cfec948e68a31aeb"} Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.187235 4580 scope.go:117] "RemoveContainer" containerID="7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.187387 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.230605 4580 scope.go:117] "RemoveContainer" containerID="7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb" Mar 21 05:16:59 crc kubenswrapper[4580]: E0321 05:16:59.230966 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb\": container with ID starting with 7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb not found: ID does not exist" containerID="7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.230998 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb"} err="failed to get container status \"7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb\": rpc error: code = NotFound desc = could not find 
container \"7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb\": container with ID starting with 7c9da183e103a8ebc75e46d35b8a1afeaf602e0f2c3e7215d401c14f821955eb not found: ID does not exist" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.233307 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.257470 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.270257 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:16:59 crc kubenswrapper[4580]: E0321 05:16:59.270889 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fbadc4-b849-4ff9-b723-acc959e19b70" containerName="dnsmasq-dns" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.270910 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fbadc4-b849-4ff9-b723-acc959e19b70" containerName="dnsmasq-dns" Mar 21 05:16:59 crc kubenswrapper[4580]: E0321 05:16:59.270919 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c" containerName="nova-scheduler-scheduler" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.270926 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c" containerName="nova-scheduler-scheduler" Mar 21 05:16:59 crc kubenswrapper[4580]: E0321 05:16:59.270953 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fbadc4-b849-4ff9-b723-acc959e19b70" containerName="init" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.270960 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fbadc4-b849-4ff9-b723-acc959e19b70" containerName="init" Mar 21 05:16:59 crc kubenswrapper[4580]: E0321 05:16:59.270990 4580 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd" containerName="nova-manage" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.270997 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd" containerName="nova-manage" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.271202 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd" containerName="nova-manage" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.271216 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c" containerName="nova-scheduler-scheduler" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.271239 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5fbadc4-b849-4ff9-b723-acc959e19b70" containerName="dnsmasq-dns" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.272329 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.274835 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.299898 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.331502 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-528d4\" (UniqueName: \"kubernetes.io/projected/6b628f21-06f6-4838-805f-b0d25851ac35-kube-api-access-528d4\") pod \"nova-scheduler-0\" (UID: \"6b628f21-06f6-4838-805f-b0d25851ac35\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.331726 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6b628f21-06f6-4838-805f-b0d25851ac35-config-data\") pod \"nova-scheduler-0\" (UID: \"6b628f21-06f6-4838-805f-b0d25851ac35\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.331829 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b628f21-06f6-4838-805f-b0d25851ac35-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6b628f21-06f6-4838-805f-b0d25851ac35\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.434115 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b628f21-06f6-4838-805f-b0d25851ac35-config-data\") pod \"nova-scheduler-0\" (UID: \"6b628f21-06f6-4838-805f-b0d25851ac35\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.434367 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b628f21-06f6-4838-805f-b0d25851ac35-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6b628f21-06f6-4838-805f-b0d25851ac35\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.434626 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-528d4\" (UniqueName: \"kubernetes.io/projected/6b628f21-06f6-4838-805f-b0d25851ac35-kube-api-access-528d4\") pod \"nova-scheduler-0\" (UID: \"6b628f21-06f6-4838-805f-b0d25851ac35\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.438726 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b628f21-06f6-4838-805f-b0d25851ac35-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"6b628f21-06f6-4838-805f-b0d25851ac35\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.451166 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b628f21-06f6-4838-805f-b0d25851ac35-config-data\") pod \"nova-scheduler-0\" (UID: \"6b628f21-06f6-4838-805f-b0d25851ac35\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.452617 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-528d4\" (UniqueName: \"kubernetes.io/projected/6b628f21-06f6-4838-805f-b0d25851ac35-kube-api-access-528d4\") pod \"nova-scheduler-0\" (UID: \"6b628f21-06f6-4838-805f-b0d25851ac35\") " pod="openstack/nova-scheduler-0" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.589666 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:16:59 crc kubenswrapper[4580]: I0321 05:16:59.628876 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c" path="/var/lib/kubelet/pods/2c8ea3ec-ea9b-46fc-b2a6-b6130d01699c/volumes" Mar 21 05:17:00 crc kubenswrapper[4580]: I0321 05:17:00.102744 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:17:00 crc kubenswrapper[4580]: I0321 05:17:00.201291 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6b628f21-06f6-4838-805f-b0d25851ac35","Type":"ContainerStarted","Data":"44068b657100a9f1c08321dc7a33f543cd92c336fca30b1c2872d033449ecea5"} Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.215240 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6b628f21-06f6-4838-805f-b0d25851ac35","Type":"ContainerStarted","Data":"a8003e13afb251caab51ae6e7f1f79eab933e82ee84b523fa275397b84120cd6"} Mar 21 05:17:01 crc 
kubenswrapper[4580]: I0321 05:17:01.218940 4580 generic.go:334] "Generic (PLEG): container finished" podID="581fa0f6-632d-4054-b169-0aa596f21ee2" containerID="34f8d2685657be65d92451ad9b924288cba4bdb608f7819146a0cc4763321bdc" exitCode=0 Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.218981 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"581fa0f6-632d-4054-b169-0aa596f21ee2","Type":"ContainerDied","Data":"34f8d2685657be65d92451ad9b924288cba4bdb608f7819146a0cc4763321bdc"} Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.236912 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.236891665 podStartE2EDuration="2.236891665s" podCreationTimestamp="2026-03-21 05:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:17:01.233247813 +0000 UTC m=+1526.315831451" watchObservedRunningTime="2026-03-21 05:17:01.236891665 +0000 UTC m=+1526.319475303" Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.335869 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.481833 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-combined-ca-bundle\") pod \"581fa0f6-632d-4054-b169-0aa596f21ee2\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.481968 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wsrh\" (UniqueName: \"kubernetes.io/projected/581fa0f6-632d-4054-b169-0aa596f21ee2-kube-api-access-6wsrh\") pod \"581fa0f6-632d-4054-b169-0aa596f21ee2\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.482016 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/581fa0f6-632d-4054-b169-0aa596f21ee2-logs\") pod \"581fa0f6-632d-4054-b169-0aa596f21ee2\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.482071 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-config-data\") pod \"581fa0f6-632d-4054-b169-0aa596f21ee2\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.482213 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-nova-metadata-tls-certs\") pod \"581fa0f6-632d-4054-b169-0aa596f21ee2\" (UID: \"581fa0f6-632d-4054-b169-0aa596f21ee2\") " Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.485495 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/581fa0f6-632d-4054-b169-0aa596f21ee2-logs" (OuterVolumeSpecName: "logs") pod "581fa0f6-632d-4054-b169-0aa596f21ee2" (UID: "581fa0f6-632d-4054-b169-0aa596f21ee2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.495979 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/581fa0f6-632d-4054-b169-0aa596f21ee2-kube-api-access-6wsrh" (OuterVolumeSpecName: "kube-api-access-6wsrh") pod "581fa0f6-632d-4054-b169-0aa596f21ee2" (UID: "581fa0f6-632d-4054-b169-0aa596f21ee2"). InnerVolumeSpecName "kube-api-access-6wsrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.515250 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "581fa0f6-632d-4054-b169-0aa596f21ee2" (UID: "581fa0f6-632d-4054-b169-0aa596f21ee2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.528956 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-config-data" (OuterVolumeSpecName: "config-data") pod "581fa0f6-632d-4054-b169-0aa596f21ee2" (UID: "581fa0f6-632d-4054-b169-0aa596f21ee2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.584345 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wsrh\" (UniqueName: \"kubernetes.io/projected/581fa0f6-632d-4054-b169-0aa596f21ee2-kube-api-access-6wsrh\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.584521 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/581fa0f6-632d-4054-b169-0aa596f21ee2-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.584842 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.584923 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.589621 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "581fa0f6-632d-4054-b169-0aa596f21ee2" (UID: "581fa0f6-632d-4054-b169-0aa596f21ee2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:17:01 crc kubenswrapper[4580]: I0321 05:17:01.686993 4580 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/581fa0f6-632d-4054-b169-0aa596f21ee2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.232815 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.232872 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"581fa0f6-632d-4054-b169-0aa596f21ee2","Type":"ContainerDied","Data":"77ffecf8de8a5a2b8b0b0410a2b95a91eacc0b72a2d23a0985408f233d20acd4"} Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.232942 4580 scope.go:117] "RemoveContainer" containerID="34f8d2685657be65d92451ad9b924288cba4bdb608f7819146a0cc4763321bdc" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.264296 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.280184 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.293109 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:17:02 crc kubenswrapper[4580]: E0321 05:17:02.293538 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="581fa0f6-632d-4054-b169-0aa596f21ee2" containerName="nova-metadata-log" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.293554 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="581fa0f6-632d-4054-b169-0aa596f21ee2" containerName="nova-metadata-log" Mar 21 05:17:02 crc kubenswrapper[4580]: E0321 05:17:02.293566 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="581fa0f6-632d-4054-b169-0aa596f21ee2" containerName="nova-metadata-metadata" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.293572 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="581fa0f6-632d-4054-b169-0aa596f21ee2" containerName="nova-metadata-metadata" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.293749 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="581fa0f6-632d-4054-b169-0aa596f21ee2" 
containerName="nova-metadata-metadata" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.293772 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="581fa0f6-632d-4054-b169-0aa596f21ee2" containerName="nova-metadata-log" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.294738 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.296902 4580 scope.go:117] "RemoveContainer" containerID="9e0feabf215b5d2e86b8ff1fdce0317ad5f4dcd5a2a8e7db8ec640dadb754b9e" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.297399 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.297576 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.330459 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.406880 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/984d329d-aa14-46fb-9c9f-c5f9eb415f73-config-data\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.407109 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr969\" (UniqueName: \"kubernetes.io/projected/984d329d-aa14-46fb-9c9f-c5f9eb415f73-kube-api-access-dr969\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.407214 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/984d329d-aa14-46fb-9c9f-c5f9eb415f73-logs\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.407338 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/984d329d-aa14-46fb-9c9f-c5f9eb415f73-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.407544 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984d329d-aa14-46fb-9c9f-c5f9eb415f73-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.508364 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984d329d-aa14-46fb-9c9f-c5f9eb415f73-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.508926 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/984d329d-aa14-46fb-9c9f-c5f9eb415f73-config-data\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.509495 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr969\" (UniqueName: 
\"kubernetes.io/projected/984d329d-aa14-46fb-9c9f-c5f9eb415f73-kube-api-access-dr969\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.509722 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/984d329d-aa14-46fb-9c9f-c5f9eb415f73-logs\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.509832 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/984d329d-aa14-46fb-9c9f-c5f9eb415f73-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.510514 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/984d329d-aa14-46fb-9c9f-c5f9eb415f73-logs\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.513550 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984d329d-aa14-46fb-9c9f-c5f9eb415f73-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.525863 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/984d329d-aa14-46fb-9c9f-c5f9eb415f73-config-data\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: 
I0321 05:17:02.526024 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/984d329d-aa14-46fb-9c9f-c5f9eb415f73-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.528630 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr969\" (UniqueName: \"kubernetes.io/projected/984d329d-aa14-46fb-9c9f-c5f9eb415f73-kube-api-access-dr969\") pod \"nova-metadata-0\" (UID: \"984d329d-aa14-46fb-9c9f-c5f9eb415f73\") " pod="openstack/nova-metadata-0" Mar 21 05:17:02 crc kubenswrapper[4580]: I0321 05:17:02.625265 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:17:03 crc kubenswrapper[4580]: I0321 05:17:03.064531 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:17:03 crc kubenswrapper[4580]: I0321 05:17:03.257959 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"984d329d-aa14-46fb-9c9f-c5f9eb415f73","Type":"ContainerStarted","Data":"ab17b52fdc120b781612cb24500f0d01dcb80a2a9966d0aad7f71c3b3cc9f2fb"} Mar 21 05:17:03 crc kubenswrapper[4580]: I0321 05:17:03.628494 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="581fa0f6-632d-4054-b169-0aa596f21ee2" path="/var/lib/kubelet/pods/581fa0f6-632d-4054-b169-0aa596f21ee2/volumes" Mar 21 05:17:04 crc kubenswrapper[4580]: I0321 05:17:04.144855 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-khbvh" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="registry-server" probeResult="failure" output=< Mar 21 05:17:04 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:17:04 crc kubenswrapper[4580]: > Mar 21 
05:17:04 crc kubenswrapper[4580]: I0321 05:17:04.267971 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"984d329d-aa14-46fb-9c9f-c5f9eb415f73","Type":"ContainerStarted","Data":"49c3bc5adabfe608b1d3cf1d316a4247de04b8c7764b7008e9b78f1b6b04bf5b"} Mar 21 05:17:04 crc kubenswrapper[4580]: I0321 05:17:04.268021 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"984d329d-aa14-46fb-9c9f-c5f9eb415f73","Type":"ContainerStarted","Data":"9c1b70078d5ca1dcd3fef2acd66eafb7dfc7a86c222a98805e9ad6eb52326f12"} Mar 21 05:17:04 crc kubenswrapper[4580]: I0321 05:17:04.285176 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.285157634 podStartE2EDuration="2.285157634s" podCreationTimestamp="2026-03-21 05:17:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:17:04.283442065 +0000 UTC m=+1529.366025713" watchObservedRunningTime="2026-03-21 05:17:04.285157634 +0000 UTC m=+1529.367741272" Mar 21 05:17:04 crc kubenswrapper[4580]: I0321 05:17:04.591970 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 21 05:17:04 crc kubenswrapper[4580]: I0321 05:17:04.910273 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 05:17:04 crc kubenswrapper[4580]: I0321 05:17:04.910321 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.252712 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.284119 4580 generic.go:334] "Generic (PLEG): container finished" podID="ce2547ac-9d8b-4542-b5d0-c76fd27e3442" containerID="b6705a71302f93ecd1668fdc3a12e92e360e7958941981cb86cbe33fd5aa54c3" exitCode=0 Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.284944 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce2547ac-9d8b-4542-b5d0-c76fd27e3442","Type":"ContainerDied","Data":"b6705a71302f93ecd1668fdc3a12e92e360e7958941981cb86cbe33fd5aa54c3"} Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.284982 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce2547ac-9d8b-4542-b5d0-c76fd27e3442","Type":"ContainerDied","Data":"e2b719f78af352626dc2c6568f14ee45e9a2a9bafffbf859870172f53ae19797"} Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.284986 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.285001 4580 scope.go:117] "RemoveContainer" containerID="b6705a71302f93ecd1668fdc3a12e92e360e7958941981cb86cbe33fd5aa54c3" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.322349 4580 scope.go:117] "RemoveContainer" containerID="731d3dc76040af2cf0e22089c23c8414fa451c0a6cc354c1c3fcb83f8aaed9cf" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.342436 4580 scope.go:117] "RemoveContainer" containerID="b6705a71302f93ecd1668fdc3a12e92e360e7958941981cb86cbe33fd5aa54c3" Mar 21 05:17:05 crc kubenswrapper[4580]: E0321 05:17:05.342838 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6705a71302f93ecd1668fdc3a12e92e360e7958941981cb86cbe33fd5aa54c3\": container with ID starting with b6705a71302f93ecd1668fdc3a12e92e360e7958941981cb86cbe33fd5aa54c3 not found: ID does not exist" containerID="b6705a71302f93ecd1668fdc3a12e92e360e7958941981cb86cbe33fd5aa54c3" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.342881 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6705a71302f93ecd1668fdc3a12e92e360e7958941981cb86cbe33fd5aa54c3"} err="failed to get container status \"b6705a71302f93ecd1668fdc3a12e92e360e7958941981cb86cbe33fd5aa54c3\": rpc error: code = NotFound desc = could not find container \"b6705a71302f93ecd1668fdc3a12e92e360e7958941981cb86cbe33fd5aa54c3\": container with ID starting with b6705a71302f93ecd1668fdc3a12e92e360e7958941981cb86cbe33fd5aa54c3 not found: ID does not exist" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.342906 4580 scope.go:117] "RemoveContainer" containerID="731d3dc76040af2cf0e22089c23c8414fa451c0a6cc354c1c3fcb83f8aaed9cf" Mar 21 05:17:05 crc kubenswrapper[4580]: E0321 05:17:05.343168 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"731d3dc76040af2cf0e22089c23c8414fa451c0a6cc354c1c3fcb83f8aaed9cf\": container with ID starting with 731d3dc76040af2cf0e22089c23c8414fa451c0a6cc354c1c3fcb83f8aaed9cf not found: ID does not exist" containerID="731d3dc76040af2cf0e22089c23c8414fa451c0a6cc354c1c3fcb83f8aaed9cf" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.343197 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"731d3dc76040af2cf0e22089c23c8414fa451c0a6cc354c1c3fcb83f8aaed9cf"} err="failed to get container status \"731d3dc76040af2cf0e22089c23c8414fa451c0a6cc354c1c3fcb83f8aaed9cf\": rpc error: code = NotFound desc = could not find container \"731d3dc76040af2cf0e22089c23c8414fa451c0a6cc354c1c3fcb83f8aaed9cf\": container with ID starting with 731d3dc76040af2cf0e22089c23c8414fa451c0a6cc354c1c3fcb83f8aaed9cf not found: ID does not exist" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.359634 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-internal-tls-certs\") pod \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.359816 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-combined-ca-bundle\") pod \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.359849 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-public-tls-certs\") pod \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " Mar 21 05:17:05 crc kubenswrapper[4580]: 
I0321 05:17:05.359889 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kbjm\" (UniqueName: \"kubernetes.io/projected/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-kube-api-access-5kbjm\") pod \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.359927 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-config-data\") pod \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.359982 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-logs\") pod \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\" (UID: \"ce2547ac-9d8b-4542-b5d0-c76fd27e3442\") " Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.360764 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-logs" (OuterVolumeSpecName: "logs") pod "ce2547ac-9d8b-4542-b5d0-c76fd27e3442" (UID: "ce2547ac-9d8b-4542-b5d0-c76fd27e3442"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.366235 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-kube-api-access-5kbjm" (OuterVolumeSpecName: "kube-api-access-5kbjm") pod "ce2547ac-9d8b-4542-b5d0-c76fd27e3442" (UID: "ce2547ac-9d8b-4542-b5d0-c76fd27e3442"). InnerVolumeSpecName "kube-api-access-5kbjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.387468 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce2547ac-9d8b-4542-b5d0-c76fd27e3442" (UID: "ce2547ac-9d8b-4542-b5d0-c76fd27e3442"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.390707 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-config-data" (OuterVolumeSpecName: "config-data") pod "ce2547ac-9d8b-4542-b5d0-c76fd27e3442" (UID: "ce2547ac-9d8b-4542-b5d0-c76fd27e3442"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.409052 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ce2547ac-9d8b-4542-b5d0-c76fd27e3442" (UID: "ce2547ac-9d8b-4542-b5d0-c76fd27e3442"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.414419 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ce2547ac-9d8b-4542-b5d0-c76fd27e3442" (UID: "ce2547ac-9d8b-4542-b5d0-c76fd27e3442"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.462362 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.462398 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.462408 4580 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.462421 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.462429 4580 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.462440 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kbjm\" (UniqueName: \"kubernetes.io/projected/ce2547ac-9d8b-4542-b5d0-c76fd27e3442-kube-api-access-5kbjm\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.632687 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.634751 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 
05:17:05.656771 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 21 05:17:05 crc kubenswrapper[4580]: E0321 05:17:05.657474 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce2547ac-9d8b-4542-b5d0-c76fd27e3442" containerName="nova-api-api" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.657615 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2547ac-9d8b-4542-b5d0-c76fd27e3442" containerName="nova-api-api" Mar 21 05:17:05 crc kubenswrapper[4580]: E0321 05:17:05.657679 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce2547ac-9d8b-4542-b5d0-c76fd27e3442" containerName="nova-api-log" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.657739 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2547ac-9d8b-4542-b5d0-c76fd27e3442" containerName="nova-api-log" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.658066 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce2547ac-9d8b-4542-b5d0-c76fd27e3442" containerName="nova-api-api" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.658144 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce2547ac-9d8b-4542-b5d0-c76fd27e3442" containerName="nova-api-log" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.659413 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.669574 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.669575 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.669574 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.686652 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.891997 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae93ee16-d710-434d-b070-65215d559dfb-logs\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.892050 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae93ee16-d710-434d-b070-65215d559dfb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.892193 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcvwr\" (UniqueName: \"kubernetes.io/projected/ae93ee16-d710-434d-b070-65215d559dfb-kube-api-access-rcvwr\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.892243 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ae93ee16-d710-434d-b070-65215d559dfb-config-data\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.892272 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae93ee16-d710-434d-b070-65215d559dfb-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.892288 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae93ee16-d710-434d-b070-65215d559dfb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.994722 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcvwr\" (UniqueName: \"kubernetes.io/projected/ae93ee16-d710-434d-b070-65215d559dfb-kube-api-access-rcvwr\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.994825 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae93ee16-d710-434d-b070-65215d559dfb-config-data\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.994850 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae93ee16-d710-434d-b070-65215d559dfb-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" 
Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.994875 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae93ee16-d710-434d-b070-65215d559dfb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.994954 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae93ee16-d710-434d-b070-65215d559dfb-logs\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.995008 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae93ee16-d710-434d-b070-65215d559dfb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:05 crc kubenswrapper[4580]: I0321 05:17:05.998705 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae93ee16-d710-434d-b070-65215d559dfb-logs\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:06 crc kubenswrapper[4580]: I0321 05:17:06.003617 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae93ee16-d710-434d-b070-65215d559dfb-config-data\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:06 crc kubenswrapper[4580]: I0321 05:17:06.004205 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae93ee16-d710-434d-b070-65215d559dfb-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:06 crc kubenswrapper[4580]: I0321 05:17:06.004712 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae93ee16-d710-434d-b070-65215d559dfb-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:06 crc kubenswrapper[4580]: I0321 05:17:06.015110 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae93ee16-d710-434d-b070-65215d559dfb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:06 crc kubenswrapper[4580]: I0321 05:17:06.028283 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcvwr\" (UniqueName: \"kubernetes.io/projected/ae93ee16-d710-434d-b070-65215d559dfb-kube-api-access-rcvwr\") pod \"nova-api-0\" (UID: \"ae93ee16-d710-434d-b070-65215d559dfb\") " pod="openstack/nova-api-0" Mar 21 05:17:06 crc kubenswrapper[4580]: I0321 05:17:06.246980 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:17:06 crc kubenswrapper[4580]: I0321 05:17:06.719048 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:17:07 crc kubenswrapper[4580]: I0321 05:17:07.306871 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae93ee16-d710-434d-b070-65215d559dfb","Type":"ContainerStarted","Data":"c1510fbbee960b5118a091fbf176f4d028b4b315040e0d65ede4bf1a4ae932f9"} Mar 21 05:17:07 crc kubenswrapper[4580]: I0321 05:17:07.307119 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae93ee16-d710-434d-b070-65215d559dfb","Type":"ContainerStarted","Data":"7f9778702e90e7023c255e89ecddf21accfad840f4b5275d2039feb17d686621"} Mar 21 05:17:07 crc kubenswrapper[4580]: I0321 05:17:07.631313 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce2547ac-9d8b-4542-b5d0-c76fd27e3442" path="/var/lib/kubelet/pods/ce2547ac-9d8b-4542-b5d0-c76fd27e3442/volumes" Mar 21 05:17:08 crc kubenswrapper[4580]: I0321 05:17:08.317614 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae93ee16-d710-434d-b070-65215d559dfb","Type":"ContainerStarted","Data":"68e9753cde34c414e6b7c54f318be315cf36d71b6f42ad3119f58a3f7f9f36e3"} Mar 21 05:17:08 crc kubenswrapper[4580]: I0321 05:17:08.354658 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.354639012 podStartE2EDuration="3.354639012s" podCreationTimestamp="2026-03-21 05:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:17:08.338672584 +0000 UTC m=+1533.421256222" watchObservedRunningTime="2026-03-21 05:17:08.354639012 +0000 UTC m=+1533.437222640" Mar 21 05:17:08 crc kubenswrapper[4580]: I0321 05:17:08.887759 4580 scope.go:117] "RemoveContainer" 
containerID="d8eb3f19c2c28fc41332dfdb63caf37a56dfbdabc374580e3c0b9f4828d22d4b" Mar 21 05:17:09 crc kubenswrapper[4580]: I0321 05:17:09.424299 4580 scope.go:117] "RemoveContainer" containerID="6022b48e0c0eb828b77506fcdd071acca3447e151624bfca742cdcf31c26cd9b" Mar 21 05:17:09 crc kubenswrapper[4580]: I0321 05:17:09.589993 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 21 05:17:09 crc kubenswrapper[4580]: I0321 05:17:09.631681 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 21 05:17:10 crc kubenswrapper[4580]: I0321 05:17:10.359544 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 21 05:17:12 crc kubenswrapper[4580]: I0321 05:17:12.626037 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 21 05:17:12 crc kubenswrapper[4580]: I0321 05:17:12.626551 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 21 05:17:13 crc kubenswrapper[4580]: I0321 05:17:13.641977 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="984d329d-aa14-46fb-9c9f-c5f9eb415f73" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:17:13 crc kubenswrapper[4580]: I0321 05:17:13.642000 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="984d329d-aa14-46fb-9c9f-c5f9eb415f73" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:17:14 crc kubenswrapper[4580]: I0321 05:17:14.135199 4580 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-khbvh" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="registry-server" probeResult="failure" output=< Mar 21 05:17:14 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:17:14 crc kubenswrapper[4580]: > Mar 21 05:17:15 crc kubenswrapper[4580]: I0321 05:17:15.947716 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:17:15 crc kubenswrapper[4580]: I0321 05:17:15.947793 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:17:16 crc kubenswrapper[4580]: I0321 05:17:16.247453 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 05:17:16 crc kubenswrapper[4580]: I0321 05:17:16.248047 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 05:17:17 crc kubenswrapper[4580]: I0321 05:17:17.258917 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ae93ee16-d710-434d-b070-65215d559dfb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 05:17:17 crc kubenswrapper[4580]: I0321 05:17:17.258958 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ae93ee16-d710-434d-b070-65215d559dfb" containerName="nova-api-api" probeResult="failure" output="Get 
\"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 05:17:18 crc kubenswrapper[4580]: I0321 05:17:18.451890 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 21 05:17:20 crc kubenswrapper[4580]: I0321 05:17:20.626428 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 21 05:17:20 crc kubenswrapper[4580]: I0321 05:17:20.626478 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 21 05:17:22 crc kubenswrapper[4580]: I0321 05:17:22.461406 4580 generic.go:334] "Generic (PLEG): container finished" podID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerID="8dec5c044edc1705690a7eafd8a1c1f2fb3f54df8a14dee933c7e1786ce58f44" exitCode=137 Mar 21 05:17:22 crc kubenswrapper[4580]: I0321 05:17:22.461498 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67655f8b6-mbx6n" event={"ID":"a03ce0fa-f7e8-4b48-bbea-95807f14dd26","Type":"ContainerDied","Data":"8dec5c044edc1705690a7eafd8a1c1f2fb3f54df8a14dee933c7e1786ce58f44"} Mar 21 05:17:22 crc kubenswrapper[4580]: I0321 05:17:22.463213 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67655f8b6-mbx6n" event={"ID":"a03ce0fa-f7e8-4b48-bbea-95807f14dd26","Type":"ContainerStarted","Data":"6519242949dd0cabe5a08c98ee03aad7ef56ceea6d27c8d851246ddd1e628bac"} Mar 21 05:17:22 crc kubenswrapper[4580]: I0321 05:17:22.463315 4580 scope.go:117] "RemoveContainer" containerID="f660370a5b85c0757c978411d4b13c5ed188b23f7b881d8e81f31c5eac41a537" Mar 21 05:17:22 crc kubenswrapper[4580]: I0321 05:17:22.467542 4580 generic.go:334] "Generic (PLEG): container finished" podID="08a0110f-428a-481d-b439-bc16e6837dc3" containerID="1da18f5c3b92ba4606247cf9c331bf50366459fc361b2b8533adb37a81f66e54" exitCode=137 Mar 21 05:17:22 crc kubenswrapper[4580]: I0321 05:17:22.467583 4580 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/horizon-587cfc8688-265kc" event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerDied","Data":"1da18f5c3b92ba4606247cf9c331bf50366459fc361b2b8533adb37a81f66e54"} Mar 21 05:17:22 crc kubenswrapper[4580]: I0321 05:17:22.467674 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587cfc8688-265kc" event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerStarted","Data":"3bf4b5d0a95ef2b6685c815a591aa035969914cf931427758f4b1e5e49483521"} Mar 21 05:17:22 crc kubenswrapper[4580]: I0321 05:17:22.630669 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 21 05:17:22 crc kubenswrapper[4580]: I0321 05:17:22.631701 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 21 05:17:22 crc kubenswrapper[4580]: I0321 05:17:22.635287 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 21 05:17:22 crc kubenswrapper[4580]: I0321 05:17:22.672509 4580 scope.go:117] "RemoveContainer" containerID="aabef473a7fedba8211603363e3d7574d3a24bbc5ab5b0fe74504ddddca72333" Mar 21 05:17:23 crc kubenswrapper[4580]: I0321 05:17:23.488764 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 21 05:17:24 crc kubenswrapper[4580]: I0321 05:17:24.136093 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-khbvh" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="registry-server" probeResult="failure" output=< Mar 21 05:17:24 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:17:24 crc kubenswrapper[4580]: > Mar 21 05:17:24 crc kubenswrapper[4580]: I0321 05:17:24.247677 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 05:17:24 crc 
kubenswrapper[4580]: I0321 05:17:24.248230 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 05:17:26 crc kubenswrapper[4580]: I0321 05:17:26.259811 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 21 05:17:26 crc kubenswrapper[4580]: I0321 05:17:26.266716 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 21 05:17:26 crc kubenswrapper[4580]: I0321 05:17:26.270890 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 21 05:17:26 crc kubenswrapper[4580]: I0321 05:17:26.518034 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 21 05:17:31 crc kubenswrapper[4580]: I0321 05:17:31.393455 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:17:31 crc kubenswrapper[4580]: I0321 05:17:31.393880 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:17:31 crc kubenswrapper[4580]: I0321 05:17:31.396987 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 05:17:31 crc kubenswrapper[4580]: I0321 05:17:31.507300 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:17:31 crc kubenswrapper[4580]: I0321 05:17:31.508406 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:17:31 crc kubenswrapper[4580]: I0321 05:17:31.510325 4580 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 21 05:17:34 crc kubenswrapper[4580]: I0321 05:17:34.142025 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-khbvh" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="registry-server" probeResult="failure" output=< Mar 21 05:17:34 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:17:34 crc kubenswrapper[4580]: > Mar 21 05:17:41 crc kubenswrapper[4580]: I0321 05:17:41.393554 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 05:17:41 crc kubenswrapper[4580]: I0321 05:17:41.507998 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67655f8b6-mbx6n" podUID="a03ce0fa-f7e8-4b48-bbea-95807f14dd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 21 05:17:44 crc kubenswrapper[4580]: I0321 05:17:44.155389 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-khbvh" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="registry-server" probeResult="failure" output=< Mar 21 05:17:44 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:17:44 crc kubenswrapper[4580]: > Mar 21 05:17:45 crc kubenswrapper[4580]: I0321 05:17:45.947283 4580 patch_prober.go:28] 
interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:17:45 crc kubenswrapper[4580]: I0321 05:17:45.947578 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:17:53 crc kubenswrapper[4580]: I0321 05:17:53.135866 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:17:53 crc kubenswrapper[4580]: I0321 05:17:53.189522 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:17:53 crc kubenswrapper[4580]: I0321 05:17:53.377856 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-khbvh"] Mar 21 05:17:54 crc kubenswrapper[4580]: I0321 05:17:54.285630 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:17:54 crc kubenswrapper[4580]: I0321 05:17:54.329081 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:17:54 crc kubenswrapper[4580]: I0321 05:17:54.751501 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-khbvh" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="registry-server" containerID="cri-o://5974694a009b237df8238ef61eb6abede38a0a3f7dd81f1f4c12e21a4d08c570" gracePeriod=2 Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.214235 4580 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.249744 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c46a62-0353-48cd-8aa9-d23f3fb2e000-catalog-content\") pod \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\" (UID: \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\") " Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.250013 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c46a62-0353-48cd-8aa9-d23f3fb2e000-utilities\") pod \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\" (UID: \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\") " Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.250040 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmlc9\" (UniqueName: \"kubernetes.io/projected/79c46a62-0353-48cd-8aa9-d23f3fb2e000-kube-api-access-nmlc9\") pod \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\" (UID: \"79c46a62-0353-48cd-8aa9-d23f3fb2e000\") " Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.250471 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79c46a62-0353-48cd-8aa9-d23f3fb2e000-utilities" (OuterVolumeSpecName: "utilities") pod "79c46a62-0353-48cd-8aa9-d23f3fb2e000" (UID: "79c46a62-0353-48cd-8aa9-d23f3fb2e000"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.257492 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c46a62-0353-48cd-8aa9-d23f3fb2e000-kube-api-access-nmlc9" (OuterVolumeSpecName: "kube-api-access-nmlc9") pod "79c46a62-0353-48cd-8aa9-d23f3fb2e000" (UID: "79c46a62-0353-48cd-8aa9-d23f3fb2e000"). 
InnerVolumeSpecName "kube-api-access-nmlc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.351452 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c46a62-0353-48cd-8aa9-d23f3fb2e000-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.351485 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmlc9\" (UniqueName: \"kubernetes.io/projected/79c46a62-0353-48cd-8aa9-d23f3fb2e000-kube-api-access-nmlc9\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.379110 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79c46a62-0353-48cd-8aa9-d23f3fb2e000-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79c46a62-0353-48cd-8aa9-d23f3fb2e000" (UID: "79c46a62-0353-48cd-8aa9-d23f3fb2e000"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.453999 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c46a62-0353-48cd-8aa9-d23f3fb2e000-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.764006 4580 generic.go:334] "Generic (PLEG): container finished" podID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerID="5974694a009b237df8238ef61eb6abede38a0a3f7dd81f1f4c12e21a4d08c570" exitCode=0 Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.764056 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khbvh" event={"ID":"79c46a62-0353-48cd-8aa9-d23f3fb2e000","Type":"ContainerDied","Data":"5974694a009b237df8238ef61eb6abede38a0a3f7dd81f1f4c12e21a4d08c570"} Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.764089 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khbvh" event={"ID":"79c46a62-0353-48cd-8aa9-d23f3fb2e000","Type":"ContainerDied","Data":"275587f7f0db2cefee8ec2c3643f7b8294518d126112109740542e6f98522cf7"} Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.764111 4580 scope.go:117] "RemoveContainer" containerID="5974694a009b237df8238ef61eb6abede38a0a3f7dd81f1f4c12e21a4d08c570" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.764202 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-khbvh" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.788882 4580 scope.go:117] "RemoveContainer" containerID="3eaf33415ff84dc8582fb75f7b658a58f4831e77d9d7931f76adc6aba6dd3735" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.797684 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-khbvh"] Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.809016 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-khbvh"] Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.815365 4580 scope.go:117] "RemoveContainer" containerID="3584b7d891fb48aaccc82e7f27f929124e903ac471bf4157fb2950d6ceca0b5d" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.851323 4580 scope.go:117] "RemoveContainer" containerID="5974694a009b237df8238ef61eb6abede38a0a3f7dd81f1f4c12e21a4d08c570" Mar 21 05:17:55 crc kubenswrapper[4580]: E0321 05:17:55.852169 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5974694a009b237df8238ef61eb6abede38a0a3f7dd81f1f4c12e21a4d08c570\": container with ID starting with 5974694a009b237df8238ef61eb6abede38a0a3f7dd81f1f4c12e21a4d08c570 not found: ID does not exist" containerID="5974694a009b237df8238ef61eb6abede38a0a3f7dd81f1f4c12e21a4d08c570" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.852206 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5974694a009b237df8238ef61eb6abede38a0a3f7dd81f1f4c12e21a4d08c570"} err="failed to get container status \"5974694a009b237df8238ef61eb6abede38a0a3f7dd81f1f4c12e21a4d08c570\": rpc error: code = NotFound desc = could not find container \"5974694a009b237df8238ef61eb6abede38a0a3f7dd81f1f4c12e21a4d08c570\": container with ID starting with 5974694a009b237df8238ef61eb6abede38a0a3f7dd81f1f4c12e21a4d08c570 not found: ID does 
not exist" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.852227 4580 scope.go:117] "RemoveContainer" containerID="3eaf33415ff84dc8582fb75f7b658a58f4831e77d9d7931f76adc6aba6dd3735" Mar 21 05:17:55 crc kubenswrapper[4580]: E0321 05:17:55.852664 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eaf33415ff84dc8582fb75f7b658a58f4831e77d9d7931f76adc6aba6dd3735\": container with ID starting with 3eaf33415ff84dc8582fb75f7b658a58f4831e77d9d7931f76adc6aba6dd3735 not found: ID does not exist" containerID="3eaf33415ff84dc8582fb75f7b658a58f4831e77d9d7931f76adc6aba6dd3735" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.852688 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eaf33415ff84dc8582fb75f7b658a58f4831e77d9d7931f76adc6aba6dd3735"} err="failed to get container status \"3eaf33415ff84dc8582fb75f7b658a58f4831e77d9d7931f76adc6aba6dd3735\": rpc error: code = NotFound desc = could not find container \"3eaf33415ff84dc8582fb75f7b658a58f4831e77d9d7931f76adc6aba6dd3735\": container with ID starting with 3eaf33415ff84dc8582fb75f7b658a58f4831e77d9d7931f76adc6aba6dd3735 not found: ID does not exist" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.852701 4580 scope.go:117] "RemoveContainer" containerID="3584b7d891fb48aaccc82e7f27f929124e903ac471bf4157fb2950d6ceca0b5d" Mar 21 05:17:55 crc kubenswrapper[4580]: E0321 05:17:55.853019 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3584b7d891fb48aaccc82e7f27f929124e903ac471bf4157fb2950d6ceca0b5d\": container with ID starting with 3584b7d891fb48aaccc82e7f27f929124e903ac471bf4157fb2950d6ceca0b5d not found: ID does not exist" containerID="3584b7d891fb48aaccc82e7f27f929124e903ac471bf4157fb2950d6ceca0b5d" Mar 21 05:17:55 crc kubenswrapper[4580]: I0321 05:17:55.853062 4580 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3584b7d891fb48aaccc82e7f27f929124e903ac471bf4157fb2950d6ceca0b5d"} err="failed to get container status \"3584b7d891fb48aaccc82e7f27f929124e903ac471bf4157fb2950d6ceca0b5d\": rpc error: code = NotFound desc = could not find container \"3584b7d891fb48aaccc82e7f27f929124e903ac471bf4157fb2950d6ceca0b5d\": container with ID starting with 3584b7d891fb48aaccc82e7f27f929124e903ac471bf4157fb2950d6ceca0b5d not found: ID does not exist" Mar 21 05:17:56 crc kubenswrapper[4580]: I0321 05:17:56.311257 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:17:56 crc kubenswrapper[4580]: I0321 05:17:56.380005 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-67655f8b6-mbx6n" Mar 21 05:17:56 crc kubenswrapper[4580]: I0321 05:17:56.456749 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-587cfc8688-265kc"] Mar 21 05:17:56 crc kubenswrapper[4580]: I0321 05:17:56.773534 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon-log" containerID="cri-o://7573b50ebc5ac682fcca653fb89d61d20bbd5e002d97c910776fa487a5d85059" gracePeriod=30 Mar 21 05:17:56 crc kubenswrapper[4580]: I0321 05:17:56.773598 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" containerID="cri-o://3bf4b5d0a95ef2b6685c815a591aa035969914cf931427758f4b1e5e49483521" gracePeriod=30 Mar 21 05:17:57 crc kubenswrapper[4580]: I0321 05:17:57.629016 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" path="/var/lib/kubelet/pods/79c46a62-0353-48cd-8aa9-d23f3fb2e000/volumes" Mar 21 05:18:00 crc 
kubenswrapper[4580]: I0321 05:18:00.147462 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567838-4qndx"] Mar 21 05:18:00 crc kubenswrapper[4580]: E0321 05:18:00.148640 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="extract-utilities" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.148656 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="extract-utilities" Mar 21 05:18:00 crc kubenswrapper[4580]: E0321 05:18:00.148666 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="extract-content" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.148673 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="extract-content" Mar 21 05:18:00 crc kubenswrapper[4580]: E0321 05:18:00.148687 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="registry-server" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.148695 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="registry-server" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.148934 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c46a62-0353-48cd-8aa9-d23f3fb2e000" containerName="registry-server" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.149699 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-4qndx" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.152013 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.152254 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.154547 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.158889 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-4qndx"] Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.255420 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g29f\" (UniqueName: \"kubernetes.io/projected/7b0c3fc8-6255-45ef-9192-4160848b545a-kube-api-access-6g29f\") pod \"auto-csr-approver-29567838-4qndx\" (UID: \"7b0c3fc8-6255-45ef-9192-4160848b545a\") " pod="openshift-infra/auto-csr-approver-29567838-4qndx" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.356967 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g29f\" (UniqueName: \"kubernetes.io/projected/7b0c3fc8-6255-45ef-9192-4160848b545a-kube-api-access-6g29f\") pod \"auto-csr-approver-29567838-4qndx\" (UID: \"7b0c3fc8-6255-45ef-9192-4160848b545a\") " pod="openshift-infra/auto-csr-approver-29567838-4qndx" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.376634 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g29f\" (UniqueName: \"kubernetes.io/projected/7b0c3fc8-6255-45ef-9192-4160848b545a-kube-api-access-6g29f\") pod \"auto-csr-approver-29567838-4qndx\" (UID: \"7b0c3fc8-6255-45ef-9192-4160848b545a\") " 
pod="openshift-infra/auto-csr-approver-29567838-4qndx" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.488940 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-4qndx" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.811656 4580 generic.go:334] "Generic (PLEG): container finished" podID="08a0110f-428a-481d-b439-bc16e6837dc3" containerID="3bf4b5d0a95ef2b6685c815a591aa035969914cf931427758f4b1e5e49483521" exitCode=0 Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.811748 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587cfc8688-265kc" event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerDied","Data":"3bf4b5d0a95ef2b6685c815a591aa035969914cf931427758f4b1e5e49483521"} Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.811998 4580 scope.go:117] "RemoveContainer" containerID="1da18f5c3b92ba4606247cf9c331bf50366459fc361b2b8533adb37a81f66e54" Mar 21 05:18:00 crc kubenswrapper[4580]: I0321 05:18:00.941164 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-4qndx"] Mar 21 05:18:01 crc kubenswrapper[4580]: I0321 05:18:01.000112 4580 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:18:01 crc kubenswrapper[4580]: I0321 05:18:01.395019 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 05:18:01 crc kubenswrapper[4580]: I0321 05:18:01.824271 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567838-4qndx" 
event={"ID":"7b0c3fc8-6255-45ef-9192-4160848b545a","Type":"ContainerStarted","Data":"30348a3f6bf7c796252a0b2edb8d8c64db34ab28f2b6358dec532c22591037e0"} Mar 21 05:18:02 crc kubenswrapper[4580]: I0321 05:18:02.834685 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567838-4qndx" event={"ID":"7b0c3fc8-6255-45ef-9192-4160848b545a","Type":"ContainerStarted","Data":"80003388c09c49a1e50de6ecfae462a1a23aa4e34d1c67cab721f187e4fb3b28"} Mar 21 05:18:02 crc kubenswrapper[4580]: I0321 05:18:02.867295 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567838-4qndx" podStartSLOduration=1.526006251 podStartE2EDuration="2.86727142s" podCreationTimestamp="2026-03-21 05:18:00 +0000 UTC" firstStartedPulling="2026-03-21 05:18:00.999848331 +0000 UTC m=+1586.082431959" lastFinishedPulling="2026-03-21 05:18:02.3411135 +0000 UTC m=+1587.423697128" observedRunningTime="2026-03-21 05:18:02.855396017 +0000 UTC m=+1587.937979655" watchObservedRunningTime="2026-03-21 05:18:02.86727142 +0000 UTC m=+1587.949855058" Mar 21 05:18:03 crc kubenswrapper[4580]: I0321 05:18:03.847774 4580 generic.go:334] "Generic (PLEG): container finished" podID="7b0c3fc8-6255-45ef-9192-4160848b545a" containerID="80003388c09c49a1e50de6ecfae462a1a23aa4e34d1c67cab721f187e4fb3b28" exitCode=0 Mar 21 05:18:03 crc kubenswrapper[4580]: I0321 05:18:03.847913 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567838-4qndx" event={"ID":"7b0c3fc8-6255-45ef-9192-4160848b545a","Type":"ContainerDied","Data":"80003388c09c49a1e50de6ecfae462a1a23aa4e34d1c67cab721f187e4fb3b28"} Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.205138 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.224369 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-4qndx" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.362418 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g29f\" (UniqueName: \"kubernetes.io/projected/7b0c3fc8-6255-45ef-9192-4160848b545a-kube-api-access-6g29f\") pod \"7b0c3fc8-6255-45ef-9192-4160848b545a\" (UID: \"7b0c3fc8-6255-45ef-9192-4160848b545a\") " Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.367932 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0c3fc8-6255-45ef-9192-4160848b545a-kube-api-access-6g29f" (OuterVolumeSpecName: "kube-api-access-6g29f") pod "7b0c3fc8-6255-45ef-9192-4160848b545a" (UID: "7b0c3fc8-6255-45ef-9192-4160848b545a"). InnerVolumeSpecName "kube-api-access-6g29f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.464897 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g29f\" (UniqueName: \"kubernetes.io/projected/7b0c3fc8-6255-45ef-9192-4160848b545a-kube-api-access-6g29f\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.759805 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x942b"] Mar 21 05:18:05 crc kubenswrapper[4580]: E0321 05:18:05.762534 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0c3fc8-6255-45ef-9192-4160848b545a" containerName="oc" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.762606 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0c3fc8-6255-45ef-9192-4160848b545a" containerName="oc" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.762922 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0c3fc8-6255-45ef-9192-4160848b545a" containerName="oc" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.764259 4580 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.789859 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x942b"] Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.863827 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-4qndx" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.863773 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567838-4qndx" event={"ID":"7b0c3fc8-6255-45ef-9192-4160848b545a","Type":"ContainerDied","Data":"30348a3f6bf7c796252a0b2edb8d8c64db34ab28f2b6358dec532c22591037e0"} Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.863889 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30348a3f6bf7c796252a0b2edb8d8c64db34ab28f2b6358dec532c22591037e0" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.875893 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgszn\" (UniqueName: \"kubernetes.io/projected/7107b04b-c7b1-46bd-8031-20e907129f4b-kube-api-access-wgszn\") pod \"certified-operators-x942b\" (UID: \"7107b04b-c7b1-46bd-8031-20e907129f4b\") " pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.876164 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7107b04b-c7b1-46bd-8031-20e907129f4b-catalog-content\") pod \"certified-operators-x942b\" (UID: \"7107b04b-c7b1-46bd-8031-20e907129f4b\") " pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.876305 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7107b04b-c7b1-46bd-8031-20e907129f4b-utilities\") pod \"certified-operators-x942b\" (UID: \"7107b04b-c7b1-46bd-8031-20e907129f4b\") " pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.923880 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-4mr87"] Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.934218 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-4mr87"] Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.978605 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7107b04b-c7b1-46bd-8031-20e907129f4b-utilities\") pod \"certified-operators-x942b\" (UID: \"7107b04b-c7b1-46bd-8031-20e907129f4b\") " pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.978860 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7107b04b-c7b1-46bd-8031-20e907129f4b-utilities\") pod \"certified-operators-x942b\" (UID: \"7107b04b-c7b1-46bd-8031-20e907129f4b\") " pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.979121 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgszn\" (UniqueName: \"kubernetes.io/projected/7107b04b-c7b1-46bd-8031-20e907129f4b-kube-api-access-wgszn\") pod \"certified-operators-x942b\" (UID: \"7107b04b-c7b1-46bd-8031-20e907129f4b\") " pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.979286 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/7107b04b-c7b1-46bd-8031-20e907129f4b-catalog-content\") pod \"certified-operators-x942b\" (UID: \"7107b04b-c7b1-46bd-8031-20e907129f4b\") " pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:05 crc kubenswrapper[4580]: I0321 05:18:05.979505 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7107b04b-c7b1-46bd-8031-20e907129f4b-catalog-content\") pod \"certified-operators-x942b\" (UID: \"7107b04b-c7b1-46bd-8031-20e907129f4b\") " pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:06 crc kubenswrapper[4580]: I0321 05:18:06.000515 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgszn\" (UniqueName: \"kubernetes.io/projected/7107b04b-c7b1-46bd-8031-20e907129f4b-kube-api-access-wgszn\") pod \"certified-operators-x942b\" (UID: \"7107b04b-c7b1-46bd-8031-20e907129f4b\") " pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:06 crc kubenswrapper[4580]: I0321 05:18:06.083245 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:06 crc kubenswrapper[4580]: I0321 05:18:06.465508 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x942b"] Mar 21 05:18:06 crc kubenswrapper[4580]: I0321 05:18:06.574631 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:18:06 crc kubenswrapper[4580]: I0321 05:18:06.874212 4580 generic.go:334] "Generic (PLEG): container finished" podID="7107b04b-c7b1-46bd-8031-20e907129f4b" containerID="fb815858d542f4470b6036174c149b8d9ae68cf66a6aa750b8700ec8abf0c9c6" exitCode=0 Mar 21 05:18:06 crc kubenswrapper[4580]: I0321 05:18:06.874250 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x942b" event={"ID":"7107b04b-c7b1-46bd-8031-20e907129f4b","Type":"ContainerDied","Data":"fb815858d542f4470b6036174c149b8d9ae68cf66a6aa750b8700ec8abf0c9c6"} Mar 21 05:18:06 crc kubenswrapper[4580]: I0321 05:18:06.874274 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x942b" event={"ID":"7107b04b-c7b1-46bd-8031-20e907129f4b","Type":"ContainerStarted","Data":"0b71d65c07c32d4a24eb9c4b34a3dc949e708c73fa4a378473ac592a7c3bae18"} Mar 21 05:18:07 crc kubenswrapper[4580]: I0321 05:18:07.632438 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="567de937-7644-4d4f-a759-f74e483dfe3d" path="/var/lib/kubelet/pods/567de937-7644-4d4f-a759-f74e483dfe3d/volumes" Mar 21 05:18:07 crc kubenswrapper[4580]: I0321 05:18:07.885519 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x942b" event={"ID":"7107b04b-c7b1-46bd-8031-20e907129f4b","Type":"ContainerStarted","Data":"c5cd0e7b1bf9d86084421c7ea76ddb4dfb471d69cf457eba107f000bf5e96a99"} Mar 21 05:18:09 crc kubenswrapper[4580]: I0321 05:18:09.731974 4580 scope.go:117] "RemoveContainer" 
containerID="bd8c14b7665fc06a8d12af5e3eba0ab892bd9185f2214c4c239e2a5951ee7b42" Mar 21 05:18:09 crc kubenswrapper[4580]: I0321 05:18:09.768890 4580 scope.go:117] "RemoveContainer" containerID="00e2791c81bdea6deaa93b6a032b985c2babac1e542835fab1777a5a7677ca19" Mar 21 05:18:10 crc kubenswrapper[4580]: I0321 05:18:10.393548 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 21 05:18:10 crc kubenswrapper[4580]: I0321 05:18:10.488670 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" containerName="rabbitmq" containerID="cri-o://87e95c71e5b4c69e0f416f42c75f627726c658bcb7ae591e1214f6008e48d864" gracePeriod=604795 Mar 21 05:18:10 crc kubenswrapper[4580]: I0321 05:18:10.931862 4580 generic.go:334] "Generic (PLEG): container finished" podID="7107b04b-c7b1-46bd-8031-20e907129f4b" containerID="c5cd0e7b1bf9d86084421c7ea76ddb4dfb471d69cf457eba107f000bf5e96a99" exitCode=0 Mar 21 05:18:10 crc kubenswrapper[4580]: I0321 05:18:10.931931 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x942b" event={"ID":"7107b04b-c7b1-46bd-8031-20e907129f4b","Type":"ContainerDied","Data":"c5cd0e7b1bf9d86084421c7ea76ddb4dfb471d69cf457eba107f000bf5e96a99"} Mar 21 05:18:11 crc kubenswrapper[4580]: I0321 05:18:11.394474 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 05:18:12 crc kubenswrapper[4580]: I0321 05:18:12.408237 4580 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ac0ed353-d343-4f14-804b-affb2f0cc4d6" containerName="rabbitmq" containerID="cri-o://86a71490e6445977417c6007cecc2a18237aebb0844bdc2aa10212685e5f69b1" gracePeriod=604795 Mar 21 05:18:12 crc kubenswrapper[4580]: I0321 05:18:12.958373 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x942b" event={"ID":"7107b04b-c7b1-46bd-8031-20e907129f4b","Type":"ContainerStarted","Data":"020bf45a0d5f3a825abdb9adbfc117b7b24220c5ae4992b4fee451c445d0d3b9"} Mar 21 05:18:12 crc kubenswrapper[4580]: I0321 05:18:12.992188 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x942b" podStartSLOduration=3.017292013 podStartE2EDuration="7.992163771s" podCreationTimestamp="2026-03-21 05:18:05 +0000 UTC" firstStartedPulling="2026-03-21 05:18:06.87660418 +0000 UTC m=+1591.959187808" lastFinishedPulling="2026-03-21 05:18:11.851475938 +0000 UTC m=+1596.934059566" observedRunningTime="2026-03-21 05:18:12.983188439 +0000 UTC m=+1598.065772077" watchObservedRunningTime="2026-03-21 05:18:12.992163771 +0000 UTC m=+1598.074747399" Mar 21 05:18:15 crc kubenswrapper[4580]: I0321 05:18:15.948248 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:18:15 crc kubenswrapper[4580]: I0321 05:18:15.948825 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:18:15 crc kubenswrapper[4580]: I0321 
05:18:15.948872 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 05:18:15 crc kubenswrapper[4580]: I0321 05:18:15.949657 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0b67d8190c897455e564af68d56eb7f7f1eabacada737f44e7b09e47464a936"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:18:15 crc kubenswrapper[4580]: I0321 05:18:15.949713 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://b0b67d8190c897455e564af68d56eb7f7f1eabacada737f44e7b09e47464a936" gracePeriod=600 Mar 21 05:18:16 crc kubenswrapper[4580]: I0321 05:18:16.091028 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:16 crc kubenswrapper[4580]: I0321 05:18:16.091248 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.017606 4580 generic.go:334] "Generic (PLEG): container finished" podID="38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" containerID="87e95c71e5b4c69e0f416f42c75f627726c658bcb7ae591e1214f6008e48d864" exitCode=0 Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.017752 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf","Type":"ContainerDied","Data":"87e95c71e5b4c69e0f416f42c75f627726c658bcb7ae591e1214f6008e48d864"} Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.019598 4580 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf","Type":"ContainerDied","Data":"f1c8689579ebf583eb3fef071adea86e79fbb3c5249e649fa8a7df56e847fde0"} Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.019618 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1c8689579ebf583eb3fef071adea86e79fbb3c5249e649fa8a7df56e847fde0" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.026497 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="b0b67d8190c897455e564af68d56eb7f7f1eabacada737f44e7b09e47464a936" exitCode=0 Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.026612 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"b0b67d8190c897455e564af68d56eb7f7f1eabacada737f44e7b09e47464a936"} Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.026668 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45"} Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.027325 4580 scope.go:117] "RemoveContainer" containerID="0008875a2f7ef6e2119165dc1e0e253e98f01735aec210fb18c6ffa1eebbb281" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.053637 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.148806 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-x942b" podUID="7107b04b-c7b1-46bd-8031-20e907129f4b" containerName="registry-server" probeResult="failure" output=< Mar 21 05:18:17 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:18:17 crc kubenswrapper[4580]: > Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.229476 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-server-conf\") pod \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.229545 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-plugins-conf\") pod \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.229594 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-pod-info\") pod \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.229660 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-tls\") pod \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.229696 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-erlang-cookie-secret\") pod \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.229753 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-config-data\") pod \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.229863 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.229892 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-confd\") pod \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.229983 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-plugins\") pod \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.230036 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-erlang-cookie\") pod \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " Mar 21 
05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.230076 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px6ls\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-kube-api-access-px6ls\") pod \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\" (UID: \"38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf\") " Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.233235 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" (UID: "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.244299 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" (UID: "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.244958 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" (UID: "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.248166 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-pod-info" (OuterVolumeSpecName: "pod-info") pod "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" (UID: "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.260240 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-kube-api-access-px6ls" (OuterVolumeSpecName: "kube-api-access-px6ls") pod "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" (UID: "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf"). InnerVolumeSpecName "kube-api-access-px6ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.271127 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" (UID: "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.271271 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" (UID: "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.311625 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" (UID: "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.312134 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-config-data" (OuterVolumeSpecName: "config-data") pod "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" (UID: "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.338048 4580 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.338178 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px6ls\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-kube-api-access-px6ls\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.338249 4580 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.338318 4580 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-pod-info\") 
on node \"crc\" DevicePath \"\"" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.338394 4580 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.338460 4580 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.338535 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.338617 4580 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.338684 4580 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.414280 4580 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.440924 4580 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.481819 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-server-conf" (OuterVolumeSpecName: "server-conf") pod "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" (UID: "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.538463 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" (UID: "38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.542593 4580 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:17 crc kubenswrapper[4580]: I0321 05:18:17.542622 4580 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf-server-conf\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.037373 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.067934 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.079948 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.106559 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:18:18 crc kubenswrapper[4580]: E0321 05:18:18.106953 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" containerName="rabbitmq" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.106984 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" containerName="rabbitmq" Mar 21 05:18:18 crc kubenswrapper[4580]: E0321 05:18:18.107026 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" containerName="setup-container" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.107032 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" containerName="setup-container" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.108165 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" containerName="rabbitmq" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.141254 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.149361 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.150960 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.151373 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.151884 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.152010 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-h94bp" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.152295 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.151898 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.173530 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.262092 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/364da597-ba18-4d63-b1be-1d925e603515-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.262603 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/364da597-ba18-4d63-b1be-1d925e603515-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.262717 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/364da597-ba18-4d63-b1be-1d925e603515-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.262817 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg7z4\" (UniqueName: \"kubernetes.io/projected/364da597-ba18-4d63-b1be-1d925e603515-kube-api-access-gg7z4\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.262907 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/364da597-ba18-4d63-b1be-1d925e603515-server-conf\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.263080 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/364da597-ba18-4d63-b1be-1d925e603515-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.263178 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") 
pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.263327 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/364da597-ba18-4d63-b1be-1d925e603515-config-data\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.263462 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/364da597-ba18-4d63-b1be-1d925e603515-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.263694 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/364da597-ba18-4d63-b1be-1d925e603515-pod-info\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.263840 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/364da597-ba18-4d63-b1be-1d925e603515-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.365982 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/364da597-ba18-4d63-b1be-1d925e603515-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " 
pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.366070 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/364da597-ba18-4d63-b1be-1d925e603515-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.366096 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg7z4\" (UniqueName: \"kubernetes.io/projected/364da597-ba18-4d63-b1be-1d925e603515-kube-api-access-gg7z4\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.366111 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/364da597-ba18-4d63-b1be-1d925e603515-server-conf\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.366151 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/364da597-ba18-4d63-b1be-1d925e603515-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.366169 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.366210 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/364da597-ba18-4d63-b1be-1d925e603515-config-data\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.366244 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/364da597-ba18-4d63-b1be-1d925e603515-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.366276 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/364da597-ba18-4d63-b1be-1d925e603515-pod-info\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.366300 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/364da597-ba18-4d63-b1be-1d925e603515-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.366348 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/364da597-ba18-4d63-b1be-1d925e603515-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.366859 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/364da597-ba18-4d63-b1be-1d925e603515-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.367154 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.367971 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/364da597-ba18-4d63-b1be-1d925e603515-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.368311 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/364da597-ba18-4d63-b1be-1d925e603515-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.371673 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/364da597-ba18-4d63-b1be-1d925e603515-pod-info\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.373163 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/364da597-ba18-4d63-b1be-1d925e603515-server-conf\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.373642 4580 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/364da597-ba18-4d63-b1be-1d925e603515-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.374172 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/364da597-ba18-4d63-b1be-1d925e603515-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.382815 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/364da597-ba18-4d63-b1be-1d925e603515-config-data\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.392601 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/364da597-ba18-4d63-b1be-1d925e603515-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.399093 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg7z4\" (UniqueName: \"kubernetes.io/projected/364da597-ba18-4d63-b1be-1d925e603515-kube-api-access-gg7z4\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.428516 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"364da597-ba18-4d63-b1be-1d925e603515\") " 
pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.485644 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 05:18:18 crc kubenswrapper[4580]: I0321 05:18:18.967998 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.058252 4580 generic.go:334] "Generic (PLEG): container finished" podID="ac0ed353-d343-4f14-804b-affb2f0cc4d6" containerID="86a71490e6445977417c6007cecc2a18237aebb0844bdc2aa10212685e5f69b1" exitCode=0 Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.058308 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ac0ed353-d343-4f14-804b-affb2f0cc4d6","Type":"ContainerDied","Data":"86a71490e6445977417c6007cecc2a18237aebb0844bdc2aa10212685e5f69b1"} Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.058336 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ac0ed353-d343-4f14-804b-affb2f0cc4d6","Type":"ContainerDied","Data":"ff05058a223549e96d9ab65726160b47c29db63239e89bb285ba89a1a353f94a"} Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.058354 4580 scope.go:117] "RemoveContainer" containerID="86a71490e6445977417c6007cecc2a18237aebb0844bdc2aa10212685e5f69b1" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.058560 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.078391 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ac0ed353-d343-4f14-804b-affb2f0cc4d6-pod-info\") pod \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.078487 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-config-data\") pod \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.078511 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.078548 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-plugins\") pod \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.078642 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-plugins-conf\") pod \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.078672 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/ac0ed353-d343-4f14-804b-affb2f0cc4d6-erlang-cookie-secret\") pod \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.078693 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-server-conf\") pod \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.078754 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsfc8\" (UniqueName: \"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-kube-api-access-tsfc8\") pod \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.078812 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-confd\") pod \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.078844 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-erlang-cookie\") pod \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.078870 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-tls\") pod \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\" (UID: \"ac0ed353-d343-4f14-804b-affb2f0cc4d6\") " Mar 21 05:18:19 crc 
kubenswrapper[4580]: I0321 05:18:19.079097 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ac0ed353-d343-4f14-804b-affb2f0cc4d6" (UID: "ac0ed353-d343-4f14-804b-affb2f0cc4d6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.079306 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ac0ed353-d343-4f14-804b-affb2f0cc4d6" (UID: "ac0ed353-d343-4f14-804b-affb2f0cc4d6"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.079382 4580 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.079396 4580 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.087421 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-kube-api-access-tsfc8" (OuterVolumeSpecName: "kube-api-access-tsfc8") pod "ac0ed353-d343-4f14-804b-affb2f0cc4d6" (UID: "ac0ed353-d343-4f14-804b-affb2f0cc4d6"). InnerVolumeSpecName "kube-api-access-tsfc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.088634 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ac0ed353-d343-4f14-804b-affb2f0cc4d6" (UID: "ac0ed353-d343-4f14-804b-affb2f0cc4d6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.092805 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "ac0ed353-d343-4f14-804b-affb2f0cc4d6" (UID: "ac0ed353-d343-4f14-804b-affb2f0cc4d6"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.093252 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ac0ed353-d343-4f14-804b-affb2f0cc4d6-pod-info" (OuterVolumeSpecName: "pod-info") pod "ac0ed353-d343-4f14-804b-affb2f0cc4d6" (UID: "ac0ed353-d343-4f14-804b-affb2f0cc4d6"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.095399 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ac0ed353-d343-4f14-804b-affb2f0cc4d6" (UID: "ac0ed353-d343-4f14-804b-affb2f0cc4d6"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.103395 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0ed353-d343-4f14-804b-affb2f0cc4d6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ac0ed353-d343-4f14-804b-affb2f0cc4d6" (UID: "ac0ed353-d343-4f14-804b-affb2f0cc4d6"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.113649 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-config-data" (OuterVolumeSpecName: "config-data") pod "ac0ed353-d343-4f14-804b-affb2f0cc4d6" (UID: "ac0ed353-d343-4f14-804b-affb2f0cc4d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.115816 4580 scope.go:117] "RemoveContainer" containerID="517c3f090ecfb64bc2b909bb5db8ba938766cb21ee2ca89f76f4bc37007b4577" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.157552 4580 scope.go:117] "RemoveContainer" containerID="86a71490e6445977417c6007cecc2a18237aebb0844bdc2aa10212685e5f69b1" Mar 21 05:18:19 crc kubenswrapper[4580]: E0321 05:18:19.158621 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86a71490e6445977417c6007cecc2a18237aebb0844bdc2aa10212685e5f69b1\": container with ID starting with 86a71490e6445977417c6007cecc2a18237aebb0844bdc2aa10212685e5f69b1 not found: ID does not exist" containerID="86a71490e6445977417c6007cecc2a18237aebb0844bdc2aa10212685e5f69b1" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.158675 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86a71490e6445977417c6007cecc2a18237aebb0844bdc2aa10212685e5f69b1"} 
err="failed to get container status \"86a71490e6445977417c6007cecc2a18237aebb0844bdc2aa10212685e5f69b1\": rpc error: code = NotFound desc = could not find container \"86a71490e6445977417c6007cecc2a18237aebb0844bdc2aa10212685e5f69b1\": container with ID starting with 86a71490e6445977417c6007cecc2a18237aebb0844bdc2aa10212685e5f69b1 not found: ID does not exist" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.158704 4580 scope.go:117] "RemoveContainer" containerID="517c3f090ecfb64bc2b909bb5db8ba938766cb21ee2ca89f76f4bc37007b4577" Mar 21 05:18:19 crc kubenswrapper[4580]: E0321 05:18:19.160305 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"517c3f090ecfb64bc2b909bb5db8ba938766cb21ee2ca89f76f4bc37007b4577\": container with ID starting with 517c3f090ecfb64bc2b909bb5db8ba938766cb21ee2ca89f76f4bc37007b4577 not found: ID does not exist" containerID="517c3f090ecfb64bc2b909bb5db8ba938766cb21ee2ca89f76f4bc37007b4577" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.160349 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"517c3f090ecfb64bc2b909bb5db8ba938766cb21ee2ca89f76f4bc37007b4577"} err="failed to get container status \"517c3f090ecfb64bc2b909bb5db8ba938766cb21ee2ca89f76f4bc37007b4577\": rpc error: code = NotFound desc = could not find container \"517c3f090ecfb64bc2b909bb5db8ba938766cb21ee2ca89f76f4bc37007b4577\": container with ID starting with 517c3f090ecfb64bc2b909bb5db8ba938766cb21ee2ca89f76f4bc37007b4577 not found: ID does not exist" Mar 21 05:18:19 crc kubenswrapper[4580]: W0321 05:18:19.167567 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod364da597_ba18_4d63_b1be_1d925e603515.slice/crio-17a5a6d761ee79fb8bdb28fceb17a6d8a06cdf9589b43f021b8ca0b6e84458ec WatchSource:0}: Error finding container 
17a5a6d761ee79fb8bdb28fceb17a6d8a06cdf9589b43f021b8ca0b6e84458ec: Status 404 returned error can't find the container with id 17a5a6d761ee79fb8bdb28fceb17a6d8a06cdf9589b43f021b8ca0b6e84458ec Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.170038 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.187125 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.187188 4580 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.187203 4580 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ac0ed353-d343-4f14-804b-affb2f0cc4d6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.187217 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsfc8\" (UniqueName: \"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-kube-api-access-tsfc8\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.187231 4580 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.187243 4580 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:19 crc 
kubenswrapper[4580]: I0321 05:18:19.187254 4580 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ac0ed353-d343-4f14-804b-affb2f0cc4d6-pod-info\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.223075 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-server-conf" (OuterVolumeSpecName: "server-conf") pod "ac0ed353-d343-4f14-804b-affb2f0cc4d6" (UID: "ac0ed353-d343-4f14-804b-affb2f0cc4d6"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.235215 4580 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.251936 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ac0ed353-d343-4f14-804b-affb2f0cc4d6" (UID: "ac0ed353-d343-4f14-804b-affb2f0cc4d6"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.297338 4580 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ac0ed353-d343-4f14-804b-affb2f0cc4d6-server-conf\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.297643 4580 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ac0ed353-d343-4f14-804b-affb2f0cc4d6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.297653 4580 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.412173 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.426422 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.450066 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:18:19 crc kubenswrapper[4580]: E0321 05:18:19.450449 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0ed353-d343-4f14-804b-affb2f0cc4d6" containerName="setup-container" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.450465 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0ed353-d343-4f14-804b-affb2f0cc4d6" containerName="setup-container" Mar 21 05:18:19 crc kubenswrapper[4580]: E0321 05:18:19.450497 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0ed353-d343-4f14-804b-affb2f0cc4d6" containerName="rabbitmq" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.450503 4580 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ac0ed353-d343-4f14-804b-affb2f0cc4d6" containerName="rabbitmq" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.450684 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac0ed353-d343-4f14-804b-affb2f0cc4d6" containerName="rabbitmq" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.451841 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.455372 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.455892 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.456365 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.456698 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.456800 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.456985 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.457620 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-9cjgq" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.478805 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.617277 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7619a3e5-e696-412d-8550-c8c30660eacd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.617371 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.617402 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7619a3e5-e696-412d-8550-c8c30660eacd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.617704 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7619a3e5-e696-412d-8550-c8c30660eacd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.617872 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7619a3e5-e696-412d-8550-c8c30660eacd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.617912 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzvbz\" (UniqueName: 
\"kubernetes.io/projected/7619a3e5-e696-412d-8550-c8c30660eacd-kube-api-access-xzvbz\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.617935 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7619a3e5-e696-412d-8550-c8c30660eacd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.617999 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7619a3e5-e696-412d-8550-c8c30660eacd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.618090 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7619a3e5-e696-412d-8550-c8c30660eacd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.618137 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7619a3e5-e696-412d-8550-c8c30660eacd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.618175 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/7619a3e5-e696-412d-8550-c8c30660eacd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.632515 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf" path="/var/lib/kubelet/pods/38ef0f25-a572-4eaf-95ed-07b7f6ffaeaf/volumes" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.633691 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac0ed353-d343-4f14-804b-affb2f0cc4d6" path="/var/lib/kubelet/pods/ac0ed353-d343-4f14-804b-affb2f0cc4d6/volumes" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.720301 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzvbz\" (UniqueName: \"kubernetes.io/projected/7619a3e5-e696-412d-8550-c8c30660eacd-kube-api-access-xzvbz\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.720346 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7619a3e5-e696-412d-8550-c8c30660eacd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.720376 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7619a3e5-e696-412d-8550-c8c30660eacd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.720420 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7619a3e5-e696-412d-8550-c8c30660eacd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.720450 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7619a3e5-e696-412d-8550-c8c30660eacd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.720485 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7619a3e5-e696-412d-8550-c8c30660eacd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.720524 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7619a3e5-e696-412d-8550-c8c30660eacd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.720553 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.720575 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/7619a3e5-e696-412d-8550-c8c30660eacd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.720631 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7619a3e5-e696-412d-8550-c8c30660eacd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.720663 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7619a3e5-e696-412d-8550-c8c30660eacd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.721127 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7619a3e5-e696-412d-8550-c8c30660eacd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.722700 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7619a3e5-e696-412d-8550-c8c30660eacd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.722897 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7619a3e5-e696-412d-8550-c8c30660eacd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.723017 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.723441 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7619a3e5-e696-412d-8550-c8c30660eacd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.723465 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7619a3e5-e696-412d-8550-c8c30660eacd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.727300 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7619a3e5-e696-412d-8550-c8c30660eacd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.728506 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7619a3e5-e696-412d-8550-c8c30660eacd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: 
I0321 05:18:19.731268 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7619a3e5-e696-412d-8550-c8c30660eacd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.734581 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7619a3e5-e696-412d-8550-c8c30660eacd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.742605 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzvbz\" (UniqueName: \"kubernetes.io/projected/7619a3e5-e696-412d-8550-c8c30660eacd-kube-api-access-xzvbz\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.759762 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7619a3e5-e696-412d-8550-c8c30660eacd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:19 crc kubenswrapper[4580]: I0321 05:18:19.768986 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:20 crc kubenswrapper[4580]: I0321 05:18:20.080187 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"364da597-ba18-4d63-b1be-1d925e603515","Type":"ContainerStarted","Data":"17a5a6d761ee79fb8bdb28fceb17a6d8a06cdf9589b43f021b8ca0b6e84458ec"} Mar 21 05:18:20 crc kubenswrapper[4580]: I0321 05:18:20.291537 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:18:20 crc kubenswrapper[4580]: W0321 05:18:20.297307 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7619a3e5_e696_412d_8550_c8c30660eacd.slice/crio-c95b04fb7ea80c4b6b1860cecf660cd6ca3caa3f5736a2c400b9fd7c19e92672 WatchSource:0}: Error finding container c95b04fb7ea80c4b6b1860cecf660cd6ca3caa3f5736a2c400b9fd7c19e92672: Status 404 returned error can't find the container with id c95b04fb7ea80c4b6b1860cecf660cd6ca3caa3f5736a2c400b9fd7c19e92672 Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.116382 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"364da597-ba18-4d63-b1be-1d925e603515","Type":"ContainerStarted","Data":"a0b281a901aee720ef6814669d4b7b8ab85d7d623a6e9299fae21611796c03e9"} Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.124527 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7619a3e5-e696-412d-8550-c8c30660eacd","Type":"ContainerStarted","Data":"c95b04fb7ea80c4b6b1860cecf660cd6ca3caa3f5736a2c400b9fd7c19e92672"} Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.394132 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-587cfc8688-265kc" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.394311 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.687474 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-g5bkj"] Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.689082 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.692091 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.694664 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-g5bkj"] Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.765727 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.766089 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.766115 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.766147 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.766162 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.766265 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-config\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.766300 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5677\" (UniqueName: \"kubernetes.io/projected/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-kube-api-access-h5677\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.868854 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.868949 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.868975 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.869023 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.869046 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.869098 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-config\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.869130 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5677\" (UniqueName: \"kubernetes.io/projected/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-kube-api-access-h5677\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.870007 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.870198 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.870210 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.870588 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-dns-svc\") pod 
\"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.870765 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-config\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.871006 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:21 crc kubenswrapper[4580]: I0321 05:18:21.889557 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5677\" (UniqueName: \"kubernetes.io/projected/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-kube-api-access-h5677\") pod \"dnsmasq-dns-79bd4cc8c9-g5bkj\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:22 crc kubenswrapper[4580]: I0321 05:18:22.028647 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:22 crc kubenswrapper[4580]: I0321 05:18:22.141225 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7619a3e5-e696-412d-8550-c8c30660eacd","Type":"ContainerStarted","Data":"64e483f6fa7300712b22b13ab022b6111574e9a009507ddf05d874335bff7ecc"} Mar 21 05:18:22 crc kubenswrapper[4580]: I0321 05:18:22.508720 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-g5bkj"] Mar 21 05:18:23 crc kubenswrapper[4580]: I0321 05:18:23.152296 4580 generic.go:334] "Generic (PLEG): container finished" podID="4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" containerID="2eab12fc51c7687047d7574bae7497545b1a4920164f7a8c280ddef9a85dae0a" exitCode=0 Mar 21 05:18:23 crc kubenswrapper[4580]: I0321 05:18:23.152347 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" event={"ID":"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d","Type":"ContainerDied","Data":"2eab12fc51c7687047d7574bae7497545b1a4920164f7a8c280ddef9a85dae0a"} Mar 21 05:18:23 crc kubenswrapper[4580]: I0321 05:18:23.152867 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" event={"ID":"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d","Type":"ContainerStarted","Data":"5bd0fedec3f1664010577a9d8f6fc19356113962febd3ac312f9c1f58e12f8cf"} Mar 21 05:18:24 crc kubenswrapper[4580]: I0321 05:18:24.162742 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" event={"ID":"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d","Type":"ContainerStarted","Data":"9b5b5712f7779b3901e848f165f14a894ec4b2f795609dfd4ca432d624fa5f82"} Mar 21 05:18:24 crc kubenswrapper[4580]: I0321 05:18:24.164073 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:24 crc kubenswrapper[4580]: I0321 05:18:24.186736 4580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" podStartSLOduration=3.186719478 podStartE2EDuration="3.186719478s" podCreationTimestamp="2026-03-21 05:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:18:24.181903273 +0000 UTC m=+1609.264486901" watchObservedRunningTime="2026-03-21 05:18:24.186719478 +0000 UTC m=+1609.269303106" Mar 21 05:18:26 crc kubenswrapper[4580]: I0321 05:18:26.128176 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:26 crc kubenswrapper[4580]: I0321 05:18:26.178690 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:26 crc kubenswrapper[4580]: I0321 05:18:26.415763 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x942b"] Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.193921 4580 generic.go:334] "Generic (PLEG): container finished" podID="08a0110f-428a-481d-b439-bc16e6837dc3" containerID="7573b50ebc5ac682fcca653fb89d61d20bbd5e002d97c910776fa487a5d85059" exitCode=137 Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.194947 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x942b" podUID="7107b04b-c7b1-46bd-8031-20e907129f4b" containerName="registry-server" containerID="cri-o://020bf45a0d5f3a825abdb9adbfc117b7b24220c5ae4992b4fee451c445d0d3b9" gracePeriod=2 Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.194004 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587cfc8688-265kc" event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerDied","Data":"7573b50ebc5ac682fcca653fb89d61d20bbd5e002d97c910776fa487a5d85059"} Mar 
21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.568425 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.657182 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.678188 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z44z\" (UniqueName: \"kubernetes.io/projected/08a0110f-428a-481d-b439-bc16e6837dc3-kube-api-access-2z44z\") pod \"08a0110f-428a-481d-b439-bc16e6837dc3\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.678459 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08a0110f-428a-481d-b439-bc16e6837dc3-config-data\") pod \"08a0110f-428a-481d-b439-bc16e6837dc3\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.678516 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-horizon-tls-certs\") pod \"08a0110f-428a-481d-b439-bc16e6837dc3\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.678577 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a0110f-428a-481d-b439-bc16e6837dc3-logs\") pod \"08a0110f-428a-481d-b439-bc16e6837dc3\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.678630 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/08a0110f-428a-481d-b439-bc16e6837dc3-scripts\") pod \"08a0110f-428a-481d-b439-bc16e6837dc3\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.678730 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-horizon-secret-key\") pod \"08a0110f-428a-481d-b439-bc16e6837dc3\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.678753 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-combined-ca-bundle\") pod \"08a0110f-428a-481d-b439-bc16e6837dc3\" (UID: \"08a0110f-428a-481d-b439-bc16e6837dc3\") " Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.682092 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a0110f-428a-481d-b439-bc16e6837dc3-logs" (OuterVolumeSpecName: "logs") pod "08a0110f-428a-481d-b439-bc16e6837dc3" (UID: "08a0110f-428a-481d-b439-bc16e6837dc3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.701995 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a0110f-428a-481d-b439-bc16e6837dc3-kube-api-access-2z44z" (OuterVolumeSpecName: "kube-api-access-2z44z") pod "08a0110f-428a-481d-b439-bc16e6837dc3" (UID: "08a0110f-428a-481d-b439-bc16e6837dc3"). InnerVolumeSpecName "kube-api-access-2z44z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.710080 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "08a0110f-428a-481d-b439-bc16e6837dc3" (UID: "08a0110f-428a-481d-b439-bc16e6837dc3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.719698 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08a0110f-428a-481d-b439-bc16e6837dc3-config-data" (OuterVolumeSpecName: "config-data") pod "08a0110f-428a-481d-b439-bc16e6837dc3" (UID: "08a0110f-428a-481d-b439-bc16e6837dc3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.730543 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08a0110f-428a-481d-b439-bc16e6837dc3" (UID: "08a0110f-428a-481d-b439-bc16e6837dc3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.735951 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08a0110f-428a-481d-b439-bc16e6837dc3-scripts" (OuterVolumeSpecName: "scripts") pod "08a0110f-428a-481d-b439-bc16e6837dc3" (UID: "08a0110f-428a-481d-b439-bc16e6837dc3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.762473 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "08a0110f-428a-481d-b439-bc16e6837dc3" (UID: "08a0110f-428a-481d-b439-bc16e6837dc3"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.781465 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgszn\" (UniqueName: \"kubernetes.io/projected/7107b04b-c7b1-46bd-8031-20e907129f4b-kube-api-access-wgszn\") pod \"7107b04b-c7b1-46bd-8031-20e907129f4b\" (UID: \"7107b04b-c7b1-46bd-8031-20e907129f4b\") " Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.782238 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7107b04b-c7b1-46bd-8031-20e907129f4b-utilities\") pod \"7107b04b-c7b1-46bd-8031-20e907129f4b\" (UID: \"7107b04b-c7b1-46bd-8031-20e907129f4b\") " Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.782440 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7107b04b-c7b1-46bd-8031-20e907129f4b-catalog-content\") pod \"7107b04b-c7b1-46bd-8031-20e907129f4b\" (UID: \"7107b04b-c7b1-46bd-8031-20e907129f4b\") " Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.783121 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7107b04b-c7b1-46bd-8031-20e907129f4b-utilities" (OuterVolumeSpecName: "utilities") pod "7107b04b-c7b1-46bd-8031-20e907129f4b" (UID: "7107b04b-c7b1-46bd-8031-20e907129f4b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.785298 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7107b04b-c7b1-46bd-8031-20e907129f4b-kube-api-access-wgszn" (OuterVolumeSpecName: "kube-api-access-wgszn") pod "7107b04b-c7b1-46bd-8031-20e907129f4b" (UID: "7107b04b-c7b1-46bd-8031-20e907129f4b"). InnerVolumeSpecName "kube-api-access-wgszn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.790650 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7107b04b-c7b1-46bd-8031-20e907129f4b-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.791845 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08a0110f-428a-481d-b439-bc16e6837dc3-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.791981 4580 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.792043 4580 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a0110f-428a-481d-b439-bc16e6837dc3-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.792104 4580 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08a0110f-428a-481d-b439-bc16e6837dc3-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.792157 4580 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.792213 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a0110f-428a-481d-b439-bc16e6837dc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.792265 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgszn\" (UniqueName: \"kubernetes.io/projected/7107b04b-c7b1-46bd-8031-20e907129f4b-kube-api-access-wgszn\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.792314 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z44z\" (UniqueName: \"kubernetes.io/projected/08a0110f-428a-481d-b439-bc16e6837dc3-kube-api-access-2z44z\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.842388 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7107b04b-c7b1-46bd-8031-20e907129f4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7107b04b-c7b1-46bd-8031-20e907129f4b" (UID: "7107b04b-c7b1-46bd-8031-20e907129f4b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:18:27 crc kubenswrapper[4580]: I0321 05:18:27.894227 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7107b04b-c7b1-46bd-8031-20e907129f4b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.205638 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587cfc8688-265kc" event={"ID":"08a0110f-428a-481d-b439-bc16e6837dc3","Type":"ContainerDied","Data":"3bf837f5f84a7e9166c2c26039f4d5a0f408f7ddf6a3b2f18ae00b5fe9399791"} Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.205697 4580 scope.go:117] "RemoveContainer" containerID="3bf4b5d0a95ef2b6685c815a591aa035969914cf931427758f4b1e5e49483521" Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.205960 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587cfc8688-265kc" Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.209120 4580 generic.go:334] "Generic (PLEG): container finished" podID="7107b04b-c7b1-46bd-8031-20e907129f4b" containerID="020bf45a0d5f3a825abdb9adbfc117b7b24220c5ae4992b4fee451c445d0d3b9" exitCode=0 Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.209197 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x942b" event={"ID":"7107b04b-c7b1-46bd-8031-20e907129f4b","Type":"ContainerDied","Data":"020bf45a0d5f3a825abdb9adbfc117b7b24220c5ae4992b4fee451c445d0d3b9"} Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.209265 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x942b" Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.209284 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x942b" event={"ID":"7107b04b-c7b1-46bd-8031-20e907129f4b","Type":"ContainerDied","Data":"0b71d65c07c32d4a24eb9c4b34a3dc949e708c73fa4a378473ac592a7c3bae18"} Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.254910 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x942b"] Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.266774 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x942b"] Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.274730 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-587cfc8688-265kc"] Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.283166 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-587cfc8688-265kc"] Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.375360 4580 scope.go:117] "RemoveContainer" containerID="7573b50ebc5ac682fcca653fb89d61d20bbd5e002d97c910776fa487a5d85059" Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.416014 4580 scope.go:117] "RemoveContainer" containerID="020bf45a0d5f3a825abdb9adbfc117b7b24220c5ae4992b4fee451c445d0d3b9" Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.441958 4580 scope.go:117] "RemoveContainer" containerID="c5cd0e7b1bf9d86084421c7ea76ddb4dfb471d69cf457eba107f000bf5e96a99" Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.463769 4580 scope.go:117] "RemoveContainer" containerID="fb815858d542f4470b6036174c149b8d9ae68cf66a6aa750b8700ec8abf0c9c6" Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.542034 4580 scope.go:117] "RemoveContainer" containerID="020bf45a0d5f3a825abdb9adbfc117b7b24220c5ae4992b4fee451c445d0d3b9" Mar 21 05:18:28 crc 
kubenswrapper[4580]: E0321 05:18:28.542712 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"020bf45a0d5f3a825abdb9adbfc117b7b24220c5ae4992b4fee451c445d0d3b9\": container with ID starting with 020bf45a0d5f3a825abdb9adbfc117b7b24220c5ae4992b4fee451c445d0d3b9 not found: ID does not exist" containerID="020bf45a0d5f3a825abdb9adbfc117b7b24220c5ae4992b4fee451c445d0d3b9" Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.542766 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"020bf45a0d5f3a825abdb9adbfc117b7b24220c5ae4992b4fee451c445d0d3b9"} err="failed to get container status \"020bf45a0d5f3a825abdb9adbfc117b7b24220c5ae4992b4fee451c445d0d3b9\": rpc error: code = NotFound desc = could not find container \"020bf45a0d5f3a825abdb9adbfc117b7b24220c5ae4992b4fee451c445d0d3b9\": container with ID starting with 020bf45a0d5f3a825abdb9adbfc117b7b24220c5ae4992b4fee451c445d0d3b9 not found: ID does not exist" Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.542841 4580 scope.go:117] "RemoveContainer" containerID="c5cd0e7b1bf9d86084421c7ea76ddb4dfb471d69cf457eba107f000bf5e96a99" Mar 21 05:18:28 crc kubenswrapper[4580]: E0321 05:18:28.543333 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5cd0e7b1bf9d86084421c7ea76ddb4dfb471d69cf457eba107f000bf5e96a99\": container with ID starting with c5cd0e7b1bf9d86084421c7ea76ddb4dfb471d69cf457eba107f000bf5e96a99 not found: ID does not exist" containerID="c5cd0e7b1bf9d86084421c7ea76ddb4dfb471d69cf457eba107f000bf5e96a99" Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.543371 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5cd0e7b1bf9d86084421c7ea76ddb4dfb471d69cf457eba107f000bf5e96a99"} err="failed to get container status 
\"c5cd0e7b1bf9d86084421c7ea76ddb4dfb471d69cf457eba107f000bf5e96a99\": rpc error: code = NotFound desc = could not find container \"c5cd0e7b1bf9d86084421c7ea76ddb4dfb471d69cf457eba107f000bf5e96a99\": container with ID starting with c5cd0e7b1bf9d86084421c7ea76ddb4dfb471d69cf457eba107f000bf5e96a99 not found: ID does not exist" Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.543421 4580 scope.go:117] "RemoveContainer" containerID="fb815858d542f4470b6036174c149b8d9ae68cf66a6aa750b8700ec8abf0c9c6" Mar 21 05:18:28 crc kubenswrapper[4580]: E0321 05:18:28.543890 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb815858d542f4470b6036174c149b8d9ae68cf66a6aa750b8700ec8abf0c9c6\": container with ID starting with fb815858d542f4470b6036174c149b8d9ae68cf66a6aa750b8700ec8abf0c9c6 not found: ID does not exist" containerID="fb815858d542f4470b6036174c149b8d9ae68cf66a6aa750b8700ec8abf0c9c6" Mar 21 05:18:28 crc kubenswrapper[4580]: I0321 05:18:28.543941 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb815858d542f4470b6036174c149b8d9ae68cf66a6aa750b8700ec8abf0c9c6"} err="failed to get container status \"fb815858d542f4470b6036174c149b8d9ae68cf66a6aa750b8700ec8abf0c9c6\": rpc error: code = NotFound desc = could not find container \"fb815858d542f4470b6036174c149b8d9ae68cf66a6aa750b8700ec8abf0c9c6\": container with ID starting with fb815858d542f4470b6036174c149b8d9ae68cf66a6aa750b8700ec8abf0c9c6 not found: ID does not exist" Mar 21 05:18:29 crc kubenswrapper[4580]: I0321 05:18:29.631203 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" path="/var/lib/kubelet/pods/08a0110f-428a-481d-b439-bc16e6837dc3/volumes" Mar 21 05:18:29 crc kubenswrapper[4580]: I0321 05:18:29.632381 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7107b04b-c7b1-46bd-8031-20e907129f4b" 
path="/var/lib/kubelet/pods/7107b04b-c7b1-46bd-8031-20e907129f4b/volumes" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.030015 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.116587 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-bbsh9"] Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.116854 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" podUID="9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" containerName="dnsmasq-dns" containerID="cri-o://71cebe7d6a47d8643893c0cb772804c24bb21b8b91d02e5c707b606715a4b9ec" gracePeriod=10 Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.263104 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cd9bffc9-mrwrr"] Mar 21 05:18:32 crc kubenswrapper[4580]: E0321 05:18:32.263482 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7107b04b-c7b1-46bd-8031-20e907129f4b" containerName="registry-server" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.263500 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7107b04b-c7b1-46bd-8031-20e907129f4b" containerName="registry-server" Mar 21 05:18:32 crc kubenswrapper[4580]: E0321 05:18:32.263510 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7107b04b-c7b1-46bd-8031-20e907129f4b" containerName="extract-utilities" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.263517 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7107b04b-c7b1-46bd-8031-20e907129f4b" containerName="extract-utilities" Mar 21 05:18:32 crc kubenswrapper[4580]: E0321 05:18:32.263536 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.263542 4580 
state_mem.go:107] "Deleted CPUSet assignment" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: E0321 05:18:32.263552 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.263558 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: E0321 05:18:32.263568 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.263573 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: E0321 05:18:32.263593 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7107b04b-c7b1-46bd-8031-20e907129f4b" containerName="extract-content" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.263599 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7107b04b-c7b1-46bd-8031-20e907129f4b" containerName="extract-content" Mar 21 05:18:32 crc kubenswrapper[4580]: E0321 05:18:32.263612 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon-log" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.263617 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon-log" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.263834 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.263851 4580 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7107b04b-c7b1-46bd-8031-20e907129f4b" containerName="registry-server" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.263873 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.263883 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon-log" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.263897 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: E0321 05:18:32.264066 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.264081 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: E0321 05:18:32.264092 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.264098 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.264367 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.265343 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.279777 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd9bffc9-mrwrr"] Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.401020 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk8jg\" (UniqueName: \"kubernetes.io/projected/0a878571-91e7-486e-8258-fc3298a5e03f-kube-api-access-bk8jg\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.401080 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.401152 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.401180 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-dns-svc\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.401270 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-openstack-edpm-ipam\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.401333 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-dns-swift-storage-0\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.401381 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-config\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.502924 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk8jg\" (UniqueName: \"kubernetes.io/projected/0a878571-91e7-486e-8258-fc3298a5e03f-kube-api-access-bk8jg\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.502972 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.503664 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.503722 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.503751 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-dns-svc\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.504227 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.504280 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-dns-svc\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.504364 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-openstack-edpm-ipam\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.504870 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-openstack-edpm-ipam\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.505004 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-dns-swift-storage-0\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.505575 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-dns-swift-storage-0\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.505634 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-config\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.506170 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a878571-91e7-486e-8258-fc3298a5e03f-config\") pod 
\"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.527794 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk8jg\" (UniqueName: \"kubernetes.io/projected/0a878571-91e7-486e-8258-fc3298a5e03f-kube-api-access-bk8jg\") pod \"dnsmasq-dns-6cd9bffc9-mrwrr\" (UID: \"0a878571-91e7-486e-8258-fc3298a5e03f\") " pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:32 crc kubenswrapper[4580]: I0321 05:18:32.599975 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.084489 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd9bffc9-mrwrr"] Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.149821 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.290066 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.290092 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" event={"ID":"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2","Type":"ContainerDied","Data":"71cebe7d6a47d8643893c0cb772804c24bb21b8b91d02e5c707b606715a4b9ec"} Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.290052 4580 generic.go:334] "Generic (PLEG): container finished" podID="9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" containerID="71cebe7d6a47d8643893c0cb772804c24bb21b8b91d02e5c707b606715a4b9ec" exitCode=0 Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.290850 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-bbsh9" event={"ID":"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2","Type":"ContainerDied","Data":"68466636260c8e64b587c41d470194bd7daf9bdfd69f4c343522ba817fe53290"} Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.290902 4580 scope.go:117] "RemoveContainer" containerID="71cebe7d6a47d8643893c0cb772804c24bb21b8b91d02e5c707b606715a4b9ec" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.293538 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" event={"ID":"0a878571-91e7-486e-8258-fc3298a5e03f","Type":"ContainerStarted","Data":"5ea949747d4131ebe081f4b8d2122b0f2c702bafd84eb6862d5b9089b7ab273c"} Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.327297 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-dns-swift-storage-0\") pod \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.327803 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-ovsdbserver-sb\") pod \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.328115 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-dns-svc\") pod \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.328167 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-ovsdbserver-nb\") pod \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.328202 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf9fw\" (UniqueName: \"kubernetes.io/projected/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-kube-api-access-hf9fw\") pod \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.328277 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-config\") pod \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\" (UID: \"9d663dd5-6647-4ea3-b927-cbe5b3d9edf2\") " Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.357387 4580 scope.go:117] "RemoveContainer" containerID="d644423c4850932b8a427715707ec77b6c8db397a6132a9cfb01435364ce07ce" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.357479 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-kube-api-access-hf9fw" 
(OuterVolumeSpecName: "kube-api-access-hf9fw") pod "9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" (UID: "9d663dd5-6647-4ea3-b927-cbe5b3d9edf2"). InnerVolumeSpecName "kube-api-access-hf9fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.430500 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf9fw\" (UniqueName: \"kubernetes.io/projected/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-kube-api-access-hf9fw\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.494240 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" (UID: "9d663dd5-6647-4ea3-b927-cbe5b3d9edf2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.521164 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-config" (OuterVolumeSpecName: "config") pod "9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" (UID: "9d663dd5-6647-4ea3-b927-cbe5b3d9edf2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.523143 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" (UID: "9d663dd5-6647-4ea3-b927-cbe5b3d9edf2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.524453 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" (UID: "9d663dd5-6647-4ea3-b927-cbe5b3d9edf2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.532582 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.532609 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.532619 4580 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.532629 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.566945 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" (UID: "9d663dd5-6647-4ea3-b927-cbe5b3d9edf2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.619637 4580 scope.go:117] "RemoveContainer" containerID="71cebe7d6a47d8643893c0cb772804c24bb21b8b91d02e5c707b606715a4b9ec" Mar 21 05:18:33 crc kubenswrapper[4580]: E0321 05:18:33.620006 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71cebe7d6a47d8643893c0cb772804c24bb21b8b91d02e5c707b606715a4b9ec\": container with ID starting with 71cebe7d6a47d8643893c0cb772804c24bb21b8b91d02e5c707b606715a4b9ec not found: ID does not exist" containerID="71cebe7d6a47d8643893c0cb772804c24bb21b8b91d02e5c707b606715a4b9ec" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.620047 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71cebe7d6a47d8643893c0cb772804c24bb21b8b91d02e5c707b606715a4b9ec"} err="failed to get container status \"71cebe7d6a47d8643893c0cb772804c24bb21b8b91d02e5c707b606715a4b9ec\": rpc error: code = NotFound desc = could not find container \"71cebe7d6a47d8643893c0cb772804c24bb21b8b91d02e5c707b606715a4b9ec\": container with ID starting with 71cebe7d6a47d8643893c0cb772804c24bb21b8b91d02e5c707b606715a4b9ec not found: ID does not exist" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.620068 4580 scope.go:117] "RemoveContainer" containerID="d644423c4850932b8a427715707ec77b6c8db397a6132a9cfb01435364ce07ce" Mar 21 05:18:33 crc kubenswrapper[4580]: E0321 05:18:33.620260 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d644423c4850932b8a427715707ec77b6c8db397a6132a9cfb01435364ce07ce\": container with ID starting with d644423c4850932b8a427715707ec77b6c8db397a6132a9cfb01435364ce07ce not found: ID does not exist" containerID="d644423c4850932b8a427715707ec77b6c8db397a6132a9cfb01435364ce07ce" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.620285 
4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d644423c4850932b8a427715707ec77b6c8db397a6132a9cfb01435364ce07ce"} err="failed to get container status \"d644423c4850932b8a427715707ec77b6c8db397a6132a9cfb01435364ce07ce\": rpc error: code = NotFound desc = could not find container \"d644423c4850932b8a427715707ec77b6c8db397a6132a9cfb01435364ce07ce\": container with ID starting with d644423c4850932b8a427715707ec77b6c8db397a6132a9cfb01435364ce07ce not found: ID does not exist" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.634560 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.635529 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-bbsh9"] Mar 21 05:18:33 crc kubenswrapper[4580]: I0321 05:18:33.643930 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-bbsh9"] Mar 21 05:18:34 crc kubenswrapper[4580]: I0321 05:18:34.302765 4580 generic.go:334] "Generic (PLEG): container finished" podID="0a878571-91e7-486e-8258-fc3298a5e03f" containerID="9692c9470e086d90400a4dca911956d5b0395cebdf1b32193c7c49f928edc8eb" exitCode=0 Mar 21 05:18:34 crc kubenswrapper[4580]: I0321 05:18:34.302872 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" event={"ID":"0a878571-91e7-486e-8258-fc3298a5e03f","Type":"ContainerDied","Data":"9692c9470e086d90400a4dca911956d5b0395cebdf1b32193c7c49f928edc8eb"} Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.317736 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" 
event={"ID":"0a878571-91e7-486e-8258-fc3298a5e03f","Type":"ContainerStarted","Data":"5d82d64e2df4aa0ca9c0f469dfbc82bc0cef77d4fc698f40c7ffc7070fa788bc"} Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.318442 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.340414 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" podStartSLOduration=3.3403911490000002 podStartE2EDuration="3.340391149s" podCreationTimestamp="2026-03-21 05:18:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:18:35.339237287 +0000 UTC m=+1620.421820945" watchObservedRunningTime="2026-03-21 05:18:35.340391149 +0000 UTC m=+1620.422974777" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.628698 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" path="/var/lib/kubelet/pods/9d663dd5-6647-4ea3-b927-cbe5b3d9edf2/volumes" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.774387 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7lcs4"] Mar 21 05:18:35 crc kubenswrapper[4580]: E0321 05:18:35.774834 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" containerName="init" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.774852 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" containerName="init" Mar 21 05:18:35 crc kubenswrapper[4580]: E0321 05:18:35.774876 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" containerName="dnsmasq-dns" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.774885 4580 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" containerName="dnsmasq-dns" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.775110 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d663dd5-6647-4ea3-b927-cbe5b3d9edf2" containerName="dnsmasq-dns" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.775124 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a0110f-428a-481d-b439-bc16e6837dc3" containerName="horizon" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.776426 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.801047 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7lcs4"] Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.874018 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c09e7728-e209-4919-b430-5d30477ba537-catalog-content\") pod \"community-operators-7lcs4\" (UID: \"c09e7728-e209-4919-b430-5d30477ba537\") " pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.874122 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c09e7728-e209-4919-b430-5d30477ba537-utilities\") pod \"community-operators-7lcs4\" (UID: \"c09e7728-e209-4919-b430-5d30477ba537\") " pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.874230 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq5zx\" (UniqueName: \"kubernetes.io/projected/c09e7728-e209-4919-b430-5d30477ba537-kube-api-access-gq5zx\") pod \"community-operators-7lcs4\" 
(UID: \"c09e7728-e209-4919-b430-5d30477ba537\") " pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.976108 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c09e7728-e209-4919-b430-5d30477ba537-utilities\") pod \"community-operators-7lcs4\" (UID: \"c09e7728-e209-4919-b430-5d30477ba537\") " pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.976236 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq5zx\" (UniqueName: \"kubernetes.io/projected/c09e7728-e209-4919-b430-5d30477ba537-kube-api-access-gq5zx\") pod \"community-operators-7lcs4\" (UID: \"c09e7728-e209-4919-b430-5d30477ba537\") " pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.976267 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c09e7728-e209-4919-b430-5d30477ba537-catalog-content\") pod \"community-operators-7lcs4\" (UID: \"c09e7728-e209-4919-b430-5d30477ba537\") " pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.976770 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c09e7728-e209-4919-b430-5d30477ba537-utilities\") pod \"community-operators-7lcs4\" (UID: \"c09e7728-e209-4919-b430-5d30477ba537\") " pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:35 crc kubenswrapper[4580]: I0321 05:18:35.976843 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c09e7728-e209-4919-b430-5d30477ba537-catalog-content\") pod \"community-operators-7lcs4\" (UID: 
\"c09e7728-e209-4919-b430-5d30477ba537\") " pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:36 crc kubenswrapper[4580]: I0321 05:18:36.006497 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq5zx\" (UniqueName: \"kubernetes.io/projected/c09e7728-e209-4919-b430-5d30477ba537-kube-api-access-gq5zx\") pod \"community-operators-7lcs4\" (UID: \"c09e7728-e209-4919-b430-5d30477ba537\") " pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:36 crc kubenswrapper[4580]: I0321 05:18:36.096238 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:36 crc kubenswrapper[4580]: W0321 05:18:36.468811 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc09e7728_e209_4919_b430_5d30477ba537.slice/crio-08e7ae4386cee4c2d91c52c9458ac3885aabcfa8cd9c595d27bb1127eadb062b WatchSource:0}: Error finding container 08e7ae4386cee4c2d91c52c9458ac3885aabcfa8cd9c595d27bb1127eadb062b: Status 404 returned error can't find the container with id 08e7ae4386cee4c2d91c52c9458ac3885aabcfa8cd9c595d27bb1127eadb062b Mar 21 05:18:36 crc kubenswrapper[4580]: I0321 05:18:36.502455 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7lcs4"] Mar 21 05:18:37 crc kubenswrapper[4580]: I0321 05:18:37.335892 4580 generic.go:334] "Generic (PLEG): container finished" podID="c09e7728-e209-4919-b430-5d30477ba537" containerID="df017601c65349cbce90658dfcb58cc441adb03e7c35549c0d46d197f0fae937" exitCode=0 Mar 21 05:18:37 crc kubenswrapper[4580]: I0321 05:18:37.335988 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lcs4" event={"ID":"c09e7728-e209-4919-b430-5d30477ba537","Type":"ContainerDied","Data":"df017601c65349cbce90658dfcb58cc441adb03e7c35549c0d46d197f0fae937"} Mar 21 05:18:37 
crc kubenswrapper[4580]: I0321 05:18:37.336281 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lcs4" event={"ID":"c09e7728-e209-4919-b430-5d30477ba537","Type":"ContainerStarted","Data":"08e7ae4386cee4c2d91c52c9458ac3885aabcfa8cd9c595d27bb1127eadb062b"} Mar 21 05:18:38 crc kubenswrapper[4580]: I0321 05:18:38.349325 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lcs4" event={"ID":"c09e7728-e209-4919-b430-5d30477ba537","Type":"ContainerStarted","Data":"c2df56cbe5e2ceff03ae378fc10a05f3d2d92fbbe02ffb566393d37f21102f5e"} Mar 21 05:18:41 crc kubenswrapper[4580]: I0321 05:18:41.380533 4580 generic.go:334] "Generic (PLEG): container finished" podID="c09e7728-e209-4919-b430-5d30477ba537" containerID="c2df56cbe5e2ceff03ae378fc10a05f3d2d92fbbe02ffb566393d37f21102f5e" exitCode=0 Mar 21 05:18:41 crc kubenswrapper[4580]: I0321 05:18:41.380687 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lcs4" event={"ID":"c09e7728-e209-4919-b430-5d30477ba537","Type":"ContainerDied","Data":"c2df56cbe5e2ceff03ae378fc10a05f3d2d92fbbe02ffb566393d37f21102f5e"} Mar 21 05:18:42 crc kubenswrapper[4580]: I0321 05:18:42.602435 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cd9bffc9-mrwrr" Mar 21 05:18:42 crc kubenswrapper[4580]: I0321 05:18:42.659032 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-g5bkj"] Mar 21 05:18:42 crc kubenswrapper[4580]: I0321 05:18:42.659314 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" podUID="4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" containerName="dnsmasq-dns" containerID="cri-o://9b5b5712f7779b3901e848f165f14a894ec4b2f795609dfd4ca432d624fa5f82" gracePeriod=10 Mar 21 05:18:43 crc kubenswrapper[4580]: I0321 05:18:43.400071 4580 generic.go:334] 
"Generic (PLEG): container finished" podID="4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" containerID="9b5b5712f7779b3901e848f165f14a894ec4b2f795609dfd4ca432d624fa5f82" exitCode=0 Mar 21 05:18:43 crc kubenswrapper[4580]: I0321 05:18:43.400114 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" event={"ID":"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d","Type":"ContainerDied","Data":"9b5b5712f7779b3901e848f165f14a894ec4b2f795609dfd4ca432d624fa5f82"} Mar 21 05:18:43 crc kubenswrapper[4580]: I0321 05:18:43.403311 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lcs4" event={"ID":"c09e7728-e209-4919-b430-5d30477ba537","Type":"ContainerStarted","Data":"70ce07e7fc38c88dfec2f6d6d7dab34ec80d1038e934a9d2728a19393ef23e08"} Mar 21 05:18:43 crc kubenswrapper[4580]: I0321 05:18:43.424860 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7lcs4" podStartSLOduration=3.632014457 podStartE2EDuration="8.424836668s" podCreationTimestamp="2026-03-21 05:18:35 +0000 UTC" firstStartedPulling="2026-03-21 05:18:37.33791703 +0000 UTC m=+1622.420500658" lastFinishedPulling="2026-03-21 05:18:42.130739241 +0000 UTC m=+1627.213322869" observedRunningTime="2026-03-21 05:18:43.419032133 +0000 UTC m=+1628.501615761" watchObservedRunningTime="2026-03-21 05:18:43.424836668 +0000 UTC m=+1628.507420296" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.317977 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.338940 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-ovsdbserver-nb\") pod \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.339005 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5677\" (UniqueName: \"kubernetes.io/projected/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-kube-api-access-h5677\") pod \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.339051 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-dns-svc\") pod \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.339079 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-config\") pod \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.339127 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-ovsdbserver-sb\") pod \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.339234 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-openstack-edpm-ipam\") pod \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.339266 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-dns-swift-storage-0\") pod \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\" (UID: \"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d\") " Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.387768 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-kube-api-access-h5677" (OuterVolumeSpecName: "kube-api-access-h5677") pod "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" (UID: "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d"). InnerVolumeSpecName "kube-api-access-h5677". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.447084 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5677\" (UniqueName: \"kubernetes.io/projected/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-kube-api-access-h5677\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.447915 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" event={"ID":"4f8fe598-30d1-49f7-bcf2-a1bd06e4459d","Type":"ContainerDied","Data":"5bd0fedec3f1664010577a9d8f6fc19356113962febd3ac312f9c1f58e12f8cf"} Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.447965 4580 scope.go:117] "RemoveContainer" containerID="9b5b5712f7779b3901e848f165f14a894ec4b2f795609dfd4ca432d624fa5f82" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.448098 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-g5bkj" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.485300 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" (UID: "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.497366 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" (UID: "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.516568 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-config" (OuterVolumeSpecName: "config") pod "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" (UID: "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.538449 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" (UID: "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.551346 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.551387 4580 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.551401 4580 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.551414 4580 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.592798 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" (UID: "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.601177 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" (UID: "4f8fe598-30d1-49f7-bcf2-a1bd06e4459d"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.614687 4580 scope.go:117] "RemoveContainer" containerID="2eab12fc51c7687047d7574bae7497545b1a4920164f7a8c280ddef9a85dae0a" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.653080 4580 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.653117 4580 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.787725 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-g5bkj"] Mar 21 05:18:44 crc kubenswrapper[4580]: I0321 05:18:44.799476 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-g5bkj"] Mar 21 05:18:45 crc kubenswrapper[4580]: I0321 05:18:45.630950 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" path="/var/lib/kubelet/pods/4f8fe598-30d1-49f7-bcf2-a1bd06e4459d/volumes" Mar 21 05:18:46 crc kubenswrapper[4580]: I0321 05:18:46.097112 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:46 crc kubenswrapper[4580]: I0321 05:18:46.097362 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:46 crc kubenswrapper[4580]: I0321 05:18:46.149344 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:53 crc kubenswrapper[4580]: I0321 05:18:53.530290 4580 generic.go:334] 
"Generic (PLEG): container finished" podID="364da597-ba18-4d63-b1be-1d925e603515" containerID="a0b281a901aee720ef6814669d4b7b8ab85d7d623a6e9299fae21611796c03e9" exitCode=0 Mar 21 05:18:53 crc kubenswrapper[4580]: I0321 05:18:53.530362 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"364da597-ba18-4d63-b1be-1d925e603515","Type":"ContainerDied","Data":"a0b281a901aee720ef6814669d4b7b8ab85d7d623a6e9299fae21611796c03e9"} Mar 21 05:18:54 crc kubenswrapper[4580]: I0321 05:18:54.545045 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"364da597-ba18-4d63-b1be-1d925e603515","Type":"ContainerStarted","Data":"f7fc03132d14f073f85b3a1b7b463aae69f35c10e1d014fce1f87174f4900065"} Mar 21 05:18:54 crc kubenswrapper[4580]: I0321 05:18:54.545987 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 21 05:18:54 crc kubenswrapper[4580]: I0321 05:18:54.549372 4580 generic.go:334] "Generic (PLEG): container finished" podID="7619a3e5-e696-412d-8550-c8c30660eacd" containerID="64e483f6fa7300712b22b13ab022b6111574e9a009507ddf05d874335bff7ecc" exitCode=0 Mar 21 05:18:54 crc kubenswrapper[4580]: I0321 05:18:54.549420 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7619a3e5-e696-412d-8550-c8c30660eacd","Type":"ContainerDied","Data":"64e483f6fa7300712b22b13ab022b6111574e9a009507ddf05d874335bff7ecc"} Mar 21 05:18:54 crc kubenswrapper[4580]: I0321 05:18:54.574854 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.574838678 podStartE2EDuration="36.574838678s" podCreationTimestamp="2026-03-21 05:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:18:54.574650113 +0000 UTC m=+1639.657233751" 
watchObservedRunningTime="2026-03-21 05:18:54.574838678 +0000 UTC m=+1639.657422306" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.167110 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h"] Mar 21 05:18:55 crc kubenswrapper[4580]: E0321 05:18:55.167760 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" containerName="init" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.167772 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" containerName="init" Mar 21 05:18:55 crc kubenswrapper[4580]: E0321 05:18:55.167821 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" containerName="dnsmasq-dns" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.167827 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" containerName="dnsmasq-dns" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.167999 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8fe598-30d1-49f7-bcf2-a1bd06e4459d" containerName="dnsmasq-dns" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.168653 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.171524 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.171709 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.171797 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.171709 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.224629 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h"] Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.257582 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9kvv\" (UniqueName: \"kubernetes.io/projected/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-kube-api-access-k9kvv\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.257641 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: 
I0321 05:18:55.257666 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.257867 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.359647 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.359790 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9kvv\" (UniqueName: \"kubernetes.io/projected/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-kube-api-access-k9kvv\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.359839 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.359868 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.363774 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.365588 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.375874 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.392588 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9kvv\" (UniqueName: \"kubernetes.io/projected/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-kube-api-access-k9kvv\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.487434 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.562019 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7619a3e5-e696-412d-8550-c8c30660eacd","Type":"ContainerStarted","Data":"374b81f958b647d35cc95f32f59c7cc687807d9cad54e55f72a3b4787730a9f2"} Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.562962 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:18:55 crc kubenswrapper[4580]: I0321 05:18:55.596618 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.596597955 podStartE2EDuration="36.596597955s" podCreationTimestamp="2026-03-21 05:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:18:55.58553417 +0000 UTC m=+1640.668117818" watchObservedRunningTime="2026-03-21 05:18:55.596597955 +0000 UTC m=+1640.679181583" Mar 21 05:18:56 crc kubenswrapper[4580]: I0321 05:18:56.148525 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:56 crc kubenswrapper[4580]: 
I0321 05:18:56.207825 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7lcs4"] Mar 21 05:18:56 crc kubenswrapper[4580]: I0321 05:18:56.569618 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7lcs4" podUID="c09e7728-e209-4919-b430-5d30477ba537" containerName="registry-server" containerID="cri-o://70ce07e7fc38c88dfec2f6d6d7dab34ec80d1038e934a9d2728a19393ef23e08" gracePeriod=2 Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.022706 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h"] Mar 21 05:18:57 crc kubenswrapper[4580]: W0321 05:18:57.036084 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod668fb1a4_aaf2_4d27_ab22_f7d0789f7cf2.slice/crio-ce9b83377d1d6d5908225997f29ba58af6b8537d82aeb74d4f0bfe2ca666dc52 WatchSource:0}: Error finding container ce9b83377d1d6d5908225997f29ba58af6b8537d82aeb74d4f0bfe2ca666dc52: Status 404 returned error can't find the container with id ce9b83377d1d6d5908225997f29ba58af6b8537d82aeb74d4f0bfe2ca666dc52 Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.125020 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.192483 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq5zx\" (UniqueName: \"kubernetes.io/projected/c09e7728-e209-4919-b430-5d30477ba537-kube-api-access-gq5zx\") pod \"c09e7728-e209-4919-b430-5d30477ba537\" (UID: \"c09e7728-e209-4919-b430-5d30477ba537\") " Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.192672 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c09e7728-e209-4919-b430-5d30477ba537-utilities\") pod \"c09e7728-e209-4919-b430-5d30477ba537\" (UID: \"c09e7728-e209-4919-b430-5d30477ba537\") " Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.192923 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c09e7728-e209-4919-b430-5d30477ba537-catalog-content\") pod \"c09e7728-e209-4919-b430-5d30477ba537\" (UID: \"c09e7728-e209-4919-b430-5d30477ba537\") " Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.194327 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c09e7728-e209-4919-b430-5d30477ba537-utilities" (OuterVolumeSpecName: "utilities") pod "c09e7728-e209-4919-b430-5d30477ba537" (UID: "c09e7728-e209-4919-b430-5d30477ba537"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.200225 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09e7728-e209-4919-b430-5d30477ba537-kube-api-access-gq5zx" (OuterVolumeSpecName: "kube-api-access-gq5zx") pod "c09e7728-e209-4919-b430-5d30477ba537" (UID: "c09e7728-e209-4919-b430-5d30477ba537"). InnerVolumeSpecName "kube-api-access-gq5zx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.295450 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq5zx\" (UniqueName: \"kubernetes.io/projected/c09e7728-e209-4919-b430-5d30477ba537-kube-api-access-gq5zx\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.295988 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c09e7728-e209-4919-b430-5d30477ba537-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.318795 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c09e7728-e209-4919-b430-5d30477ba537-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c09e7728-e209-4919-b430-5d30477ba537" (UID: "c09e7728-e209-4919-b430-5d30477ba537"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.397862 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c09e7728-e209-4919-b430-5d30477ba537-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.580627 4580 generic.go:334] "Generic (PLEG): container finished" podID="c09e7728-e209-4919-b430-5d30477ba537" containerID="70ce07e7fc38c88dfec2f6d6d7dab34ec80d1038e934a9d2728a19393ef23e08" exitCode=0 Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.580679 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lcs4" event={"ID":"c09e7728-e209-4919-b430-5d30477ba537","Type":"ContainerDied","Data":"70ce07e7fc38c88dfec2f6d6d7dab34ec80d1038e934a9d2728a19393ef23e08"} Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.580705 4580 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-7lcs4" event={"ID":"c09e7728-e209-4919-b430-5d30477ba537","Type":"ContainerDied","Data":"08e7ae4386cee4c2d91c52c9458ac3885aabcfa8cd9c595d27bb1127eadb062b"} Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.580723 4580 scope.go:117] "RemoveContainer" containerID="70ce07e7fc38c88dfec2f6d6d7dab34ec80d1038e934a9d2728a19393ef23e08" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.580938 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7lcs4" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.592238 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" event={"ID":"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2","Type":"ContainerStarted","Data":"ce9b83377d1d6d5908225997f29ba58af6b8537d82aeb74d4f0bfe2ca666dc52"} Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.623490 4580 scope.go:117] "RemoveContainer" containerID="c2df56cbe5e2ceff03ae378fc10a05f3d2d92fbbe02ffb566393d37f21102f5e" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.632363 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7lcs4"] Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.637333 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7lcs4"] Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.655147 4580 scope.go:117] "RemoveContainer" containerID="df017601c65349cbce90658dfcb58cc441adb03e7c35549c0d46d197f0fae937" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.716231 4580 scope.go:117] "RemoveContainer" containerID="70ce07e7fc38c88dfec2f6d6d7dab34ec80d1038e934a9d2728a19393ef23e08" Mar 21 05:18:57 crc kubenswrapper[4580]: E0321 05:18:57.736175 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"70ce07e7fc38c88dfec2f6d6d7dab34ec80d1038e934a9d2728a19393ef23e08\": container with ID starting with 70ce07e7fc38c88dfec2f6d6d7dab34ec80d1038e934a9d2728a19393ef23e08 not found: ID does not exist" containerID="70ce07e7fc38c88dfec2f6d6d7dab34ec80d1038e934a9d2728a19393ef23e08" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.736228 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ce07e7fc38c88dfec2f6d6d7dab34ec80d1038e934a9d2728a19393ef23e08"} err="failed to get container status \"70ce07e7fc38c88dfec2f6d6d7dab34ec80d1038e934a9d2728a19393ef23e08\": rpc error: code = NotFound desc = could not find container \"70ce07e7fc38c88dfec2f6d6d7dab34ec80d1038e934a9d2728a19393ef23e08\": container with ID starting with 70ce07e7fc38c88dfec2f6d6d7dab34ec80d1038e934a9d2728a19393ef23e08 not found: ID does not exist" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.736271 4580 scope.go:117] "RemoveContainer" containerID="c2df56cbe5e2ceff03ae378fc10a05f3d2d92fbbe02ffb566393d37f21102f5e" Mar 21 05:18:57 crc kubenswrapper[4580]: E0321 05:18:57.736741 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2df56cbe5e2ceff03ae378fc10a05f3d2d92fbbe02ffb566393d37f21102f5e\": container with ID starting with c2df56cbe5e2ceff03ae378fc10a05f3d2d92fbbe02ffb566393d37f21102f5e not found: ID does not exist" containerID="c2df56cbe5e2ceff03ae378fc10a05f3d2d92fbbe02ffb566393d37f21102f5e" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.736858 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2df56cbe5e2ceff03ae378fc10a05f3d2d92fbbe02ffb566393d37f21102f5e"} err="failed to get container status \"c2df56cbe5e2ceff03ae378fc10a05f3d2d92fbbe02ffb566393d37f21102f5e\": rpc error: code = NotFound desc = could not find container \"c2df56cbe5e2ceff03ae378fc10a05f3d2d92fbbe02ffb566393d37f21102f5e\": container 
with ID starting with c2df56cbe5e2ceff03ae378fc10a05f3d2d92fbbe02ffb566393d37f21102f5e not found: ID does not exist" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.736875 4580 scope.go:117] "RemoveContainer" containerID="df017601c65349cbce90658dfcb58cc441adb03e7c35549c0d46d197f0fae937" Mar 21 05:18:57 crc kubenswrapper[4580]: E0321 05:18:57.737292 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df017601c65349cbce90658dfcb58cc441adb03e7c35549c0d46d197f0fae937\": container with ID starting with df017601c65349cbce90658dfcb58cc441adb03e7c35549c0d46d197f0fae937 not found: ID does not exist" containerID="df017601c65349cbce90658dfcb58cc441adb03e7c35549c0d46d197f0fae937" Mar 21 05:18:57 crc kubenswrapper[4580]: I0321 05:18:57.737312 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df017601c65349cbce90658dfcb58cc441adb03e7c35549c0d46d197f0fae937"} err="failed to get container status \"df017601c65349cbce90658dfcb58cc441adb03e7c35549c0d46d197f0fae937\": rpc error: code = NotFound desc = could not find container \"df017601c65349cbce90658dfcb58cc441adb03e7c35549c0d46d197f0fae937\": container with ID starting with df017601c65349cbce90658dfcb58cc441adb03e7c35549c0d46d197f0fae937 not found: ID does not exist" Mar 21 05:18:59 crc kubenswrapper[4580]: I0321 05:18:59.629909 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c09e7728-e209-4919-b430-5d30477ba537" path="/var/lib/kubelet/pods/c09e7728-e209-4919-b430-5d30477ba537/volumes" Mar 21 05:19:08 crc kubenswrapper[4580]: I0321 05:19:08.489973 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 21 05:19:08 crc kubenswrapper[4580]: I0321 05:19:08.762581 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" 
event={"ID":"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2","Type":"ContainerStarted","Data":"d037a25c0ffb829e67e706f430581454ef264feb29222ad856ec568874b1bd94"} Mar 21 05:19:08 crc kubenswrapper[4580]: I0321 05:19:08.782106 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" podStartSLOduration=2.510505073 podStartE2EDuration="13.782087124s" podCreationTimestamp="2026-03-21 05:18:55 +0000 UTC" firstStartedPulling="2026-03-21 05:18:57.043516284 +0000 UTC m=+1642.126099912" lastFinishedPulling="2026-03-21 05:19:08.315098335 +0000 UTC m=+1653.397681963" observedRunningTime="2026-03-21 05:19:08.782021892 +0000 UTC m=+1653.864605520" watchObservedRunningTime="2026-03-21 05:19:08.782087124 +0000 UTC m=+1653.864670752" Mar 21 05:19:09 crc kubenswrapper[4580]: I0321 05:19:09.772483 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:19:09 crc kubenswrapper[4580]: I0321 05:19:09.928061 4580 scope.go:117] "RemoveContainer" containerID="d0014e99e2f2d74cb5b862030bdc3f140458004263e57eda6acee87acd800e7c" Mar 21 05:19:09 crc kubenswrapper[4580]: I0321 05:19:09.971585 4580 scope.go:117] "RemoveContainer" containerID="5cff183c9eab864165e01745bb922f30eb4c972ab49659f258461f0209a9ced5" Mar 21 05:19:10 crc kubenswrapper[4580]: I0321 05:19:10.089383 4580 scope.go:117] "RemoveContainer" containerID="4d38eef5a720f735d000140426f0b792c62476df0bed0a49cfb7c06e936f571f" Mar 21 05:19:10 crc kubenswrapper[4580]: I0321 05:19:10.123723 4580 scope.go:117] "RemoveContainer" containerID="87e95c71e5b4c69e0f416f42c75f627726c658bcb7ae591e1214f6008e48d864" Mar 21 05:19:15 crc kubenswrapper[4580]: I0321 05:19:15.813204 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4xbmm"] Mar 21 05:19:15 crc kubenswrapper[4580]: E0321 05:19:15.814231 4580 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c09e7728-e209-4919-b430-5d30477ba537" containerName="extract-utilities" Mar 21 05:19:15 crc kubenswrapper[4580]: I0321 05:19:15.814248 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09e7728-e209-4919-b430-5d30477ba537" containerName="extract-utilities" Mar 21 05:19:15 crc kubenswrapper[4580]: E0321 05:19:15.814289 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09e7728-e209-4919-b430-5d30477ba537" containerName="registry-server" Mar 21 05:19:15 crc kubenswrapper[4580]: I0321 05:19:15.814300 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09e7728-e209-4919-b430-5d30477ba537" containerName="registry-server" Mar 21 05:19:15 crc kubenswrapper[4580]: E0321 05:19:15.814323 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09e7728-e209-4919-b430-5d30477ba537" containerName="extract-content" Mar 21 05:19:15 crc kubenswrapper[4580]: I0321 05:19:15.814331 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09e7728-e209-4919-b430-5d30477ba537" containerName="extract-content" Mar 21 05:19:15 crc kubenswrapper[4580]: I0321 05:19:15.814544 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09e7728-e209-4919-b430-5d30477ba537" containerName="registry-server" Mar 21 05:19:15 crc kubenswrapper[4580]: I0321 05:19:15.817334 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:15 crc kubenswrapper[4580]: I0321 05:19:15.845972 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xbmm"] Mar 21 05:19:16 crc kubenswrapper[4580]: I0321 05:19:16.001169 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe06f798-599a-4a42-9b4d-feded74e5ba7-utilities\") pod \"redhat-marketplace-4xbmm\" (UID: \"fe06f798-599a-4a42-9b4d-feded74e5ba7\") " pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:16 crc kubenswrapper[4580]: I0321 05:19:16.001277 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe06f798-599a-4a42-9b4d-feded74e5ba7-catalog-content\") pod \"redhat-marketplace-4xbmm\" (UID: \"fe06f798-599a-4a42-9b4d-feded74e5ba7\") " pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:16 crc kubenswrapper[4580]: I0321 05:19:16.001327 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk8s9\" (UniqueName: \"kubernetes.io/projected/fe06f798-599a-4a42-9b4d-feded74e5ba7-kube-api-access-pk8s9\") pod \"redhat-marketplace-4xbmm\" (UID: \"fe06f798-599a-4a42-9b4d-feded74e5ba7\") " pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:16 crc kubenswrapper[4580]: I0321 05:19:16.103496 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe06f798-599a-4a42-9b4d-feded74e5ba7-utilities\") pod \"redhat-marketplace-4xbmm\" (UID: \"fe06f798-599a-4a42-9b4d-feded74e5ba7\") " pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:16 crc kubenswrapper[4580]: I0321 05:19:16.103612 4580 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe06f798-599a-4a42-9b4d-feded74e5ba7-catalog-content\") pod \"redhat-marketplace-4xbmm\" (UID: \"fe06f798-599a-4a42-9b4d-feded74e5ba7\") " pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:16 crc kubenswrapper[4580]: I0321 05:19:16.103656 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk8s9\" (UniqueName: \"kubernetes.io/projected/fe06f798-599a-4a42-9b4d-feded74e5ba7-kube-api-access-pk8s9\") pod \"redhat-marketplace-4xbmm\" (UID: \"fe06f798-599a-4a42-9b4d-feded74e5ba7\") " pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:16 crc kubenswrapper[4580]: I0321 05:19:16.104221 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe06f798-599a-4a42-9b4d-feded74e5ba7-utilities\") pod \"redhat-marketplace-4xbmm\" (UID: \"fe06f798-599a-4a42-9b4d-feded74e5ba7\") " pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:16 crc kubenswrapper[4580]: I0321 05:19:16.104313 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe06f798-599a-4a42-9b4d-feded74e5ba7-catalog-content\") pod \"redhat-marketplace-4xbmm\" (UID: \"fe06f798-599a-4a42-9b4d-feded74e5ba7\") " pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:16 crc kubenswrapper[4580]: I0321 05:19:16.132090 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk8s9\" (UniqueName: \"kubernetes.io/projected/fe06f798-599a-4a42-9b4d-feded74e5ba7-kube-api-access-pk8s9\") pod \"redhat-marketplace-4xbmm\" (UID: \"fe06f798-599a-4a42-9b4d-feded74e5ba7\") " pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:16 crc kubenswrapper[4580]: I0321 05:19:16.136854 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:16 crc kubenswrapper[4580]: I0321 05:19:16.604583 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xbmm"] Mar 21 05:19:16 crc kubenswrapper[4580]: I0321 05:19:16.860153 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbmm" event={"ID":"fe06f798-599a-4a42-9b4d-feded74e5ba7","Type":"ContainerStarted","Data":"1d193f9605c19ce8cb619f0067abcf0d944e1a83108227880aa3e110f26a2ab2"} Mar 21 05:19:16 crc kubenswrapper[4580]: I0321 05:19:16.860489 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbmm" event={"ID":"fe06f798-599a-4a42-9b4d-feded74e5ba7","Type":"ContainerStarted","Data":"497b66ab50307338aea4cde56a8ce52da020b279a8f05e8c78b29315fc2bb0c4"} Mar 21 05:19:17 crc kubenswrapper[4580]: I0321 05:19:17.876372 4580 generic.go:334] "Generic (PLEG): container finished" podID="fe06f798-599a-4a42-9b4d-feded74e5ba7" containerID="1d193f9605c19ce8cb619f0067abcf0d944e1a83108227880aa3e110f26a2ab2" exitCode=0 Mar 21 05:19:17 crc kubenswrapper[4580]: I0321 05:19:17.876421 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbmm" event={"ID":"fe06f798-599a-4a42-9b4d-feded74e5ba7","Type":"ContainerDied","Data":"1d193f9605c19ce8cb619f0067abcf0d944e1a83108227880aa3e110f26a2ab2"} Mar 21 05:19:17 crc kubenswrapper[4580]: I0321 05:19:17.876723 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbmm" event={"ID":"fe06f798-599a-4a42-9b4d-feded74e5ba7","Type":"ContainerStarted","Data":"01417d25b05c71a5494d2319d03c669d9a3d5062e914df87ef181b6930dc67ab"} Mar 21 05:19:20 crc kubenswrapper[4580]: I0321 05:19:20.904549 4580 generic.go:334] "Generic (PLEG): container finished" podID="fe06f798-599a-4a42-9b4d-feded74e5ba7" 
containerID="01417d25b05c71a5494d2319d03c669d9a3d5062e914df87ef181b6930dc67ab" exitCode=0 Mar 21 05:19:20 crc kubenswrapper[4580]: I0321 05:19:20.904635 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbmm" event={"ID":"fe06f798-599a-4a42-9b4d-feded74e5ba7","Type":"ContainerDied","Data":"01417d25b05c71a5494d2319d03c669d9a3d5062e914df87ef181b6930dc67ab"} Mar 21 05:19:21 crc kubenswrapper[4580]: I0321 05:19:21.915440 4580 generic.go:334] "Generic (PLEG): container finished" podID="668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2" containerID="d037a25c0ffb829e67e706f430581454ef264feb29222ad856ec568874b1bd94" exitCode=0 Mar 21 05:19:21 crc kubenswrapper[4580]: I0321 05:19:21.915775 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" event={"ID":"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2","Type":"ContainerDied","Data":"d037a25c0ffb829e67e706f430581454ef264feb29222ad856ec568874b1bd94"} Mar 21 05:19:21 crc kubenswrapper[4580]: I0321 05:19:21.928994 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbmm" event={"ID":"fe06f798-599a-4a42-9b4d-feded74e5ba7","Type":"ContainerStarted","Data":"abb47cd7680c3543059eba6d18aa3ed5be893eff901afa742cdc987c48393543"} Mar 21 05:19:21 crc kubenswrapper[4580]: I0321 05:19:21.965352 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4xbmm" podStartSLOduration=2.492966095 podStartE2EDuration="6.965326253s" podCreationTimestamp="2026-03-21 05:19:15 +0000 UTC" firstStartedPulling="2026-03-21 05:19:16.863583159 +0000 UTC m=+1661.946166787" lastFinishedPulling="2026-03-21 05:19:21.335943317 +0000 UTC m=+1666.418526945" observedRunningTime="2026-03-21 05:19:21.962351774 +0000 UTC m=+1667.044935412" watchObservedRunningTime="2026-03-21 05:19:21.965326253 +0000 UTC m=+1667.047909881" Mar 21 05:19:23 crc kubenswrapper[4580]: 
I0321 05:19:23.366162 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.556620 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-ssh-key-openstack-edpm-ipam\") pod \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.556706 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9kvv\" (UniqueName: \"kubernetes.io/projected/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-kube-api-access-k9kvv\") pod \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.556814 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-repo-setup-combined-ca-bundle\") pod \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.556912 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-inventory\") pod \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\" (UID: \"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2\") " Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.562947 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-kube-api-access-k9kvv" (OuterVolumeSpecName: "kube-api-access-k9kvv") pod "668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2" (UID: 
"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2"). InnerVolumeSpecName "kube-api-access-k9kvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.579674 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2" (UID: "668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.584623 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2" (UID: "668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.587933 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-inventory" (OuterVolumeSpecName: "inventory") pod "668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2" (UID: "668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.660352 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.660389 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9kvv\" (UniqueName: \"kubernetes.io/projected/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-kube-api-access-k9kvv\") on node \"crc\" DevicePath \"\"" Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.660399 4580 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.660422 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.955314 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" event={"ID":"668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2","Type":"ContainerDied","Data":"ce9b83377d1d6d5908225997f29ba58af6b8537d82aeb74d4f0bfe2ca666dc52"} Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.955395 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce9b83377d1d6d5908225997f29ba58af6b8537d82aeb74d4f0bfe2ca666dc52" Mar 21 05:19:23 crc kubenswrapper[4580]: I0321 05:19:23.955636 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.119613 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx"] Mar 21 05:19:24 crc kubenswrapper[4580]: E0321 05:19:24.120276 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.120300 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.120518 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.121548 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.124479 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.124890 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.127234 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.127485 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.136326 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx"] Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.172548 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/251c60b9-f972-4aec-85af-f00d48e21662-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dp8qx\" (UID: \"251c60b9-f972-4aec-85af-f00d48e21662\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.172610 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxm9x\" (UniqueName: \"kubernetes.io/projected/251c60b9-f972-4aec-85af-f00d48e21662-kube-api-access-sxm9x\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dp8qx\" (UID: \"251c60b9-f972-4aec-85af-f00d48e21662\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.172712 4580 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/251c60b9-f972-4aec-85af-f00d48e21662-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dp8qx\" (UID: \"251c60b9-f972-4aec-85af-f00d48e21662\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.274295 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/251c60b9-f972-4aec-85af-f00d48e21662-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dp8qx\" (UID: \"251c60b9-f972-4aec-85af-f00d48e21662\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.274952 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/251c60b9-f972-4aec-85af-f00d48e21662-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dp8qx\" (UID: \"251c60b9-f972-4aec-85af-f00d48e21662\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.274990 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxm9x\" (UniqueName: \"kubernetes.io/projected/251c60b9-f972-4aec-85af-f00d48e21662-kube-api-access-sxm9x\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dp8qx\" (UID: \"251c60b9-f972-4aec-85af-f00d48e21662\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.286649 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/251c60b9-f972-4aec-85af-f00d48e21662-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dp8qx\" (UID: 
\"251c60b9-f972-4aec-85af-f00d48e21662\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.287067 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/251c60b9-f972-4aec-85af-f00d48e21662-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dp8qx\" (UID: \"251c60b9-f972-4aec-85af-f00d48e21662\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.293479 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxm9x\" (UniqueName: \"kubernetes.io/projected/251c60b9-f972-4aec-85af-f00d48e21662-kube-api-access-sxm9x\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dp8qx\" (UID: \"251c60b9-f972-4aec-85af-f00d48e21662\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.444104 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" Mar 21 05:19:24 crc kubenswrapper[4580]: I0321 05:19:24.984483 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx"] Mar 21 05:19:25 crc kubenswrapper[4580]: I0321 05:19:25.981441 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" event={"ID":"251c60b9-f972-4aec-85af-f00d48e21662","Type":"ContainerStarted","Data":"b93e488a8f06e652f68d4cadca5e144a42cba63bd4edf3a95e1ebbf1b30359d5"} Mar 21 05:19:25 crc kubenswrapper[4580]: I0321 05:19:25.983003 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" event={"ID":"251c60b9-f972-4aec-85af-f00d48e21662","Type":"ContainerStarted","Data":"1c51aa8e45146d86527f59b29faad681e3dd2e728b7810c6e7df51fd8b96e678"} Mar 21 05:19:26 crc kubenswrapper[4580]: I0321 05:19:26.002580 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" podStartSLOduration=1.5373277010000002 podStartE2EDuration="2.002562522s" podCreationTimestamp="2026-03-21 05:19:24 +0000 UTC" firstStartedPulling="2026-03-21 05:19:24.997039018 +0000 UTC m=+1670.079622646" lastFinishedPulling="2026-03-21 05:19:25.462273839 +0000 UTC m=+1670.544857467" observedRunningTime="2026-03-21 05:19:25.999648864 +0000 UTC m=+1671.082232502" watchObservedRunningTime="2026-03-21 05:19:26.002562522 +0000 UTC m=+1671.085146150" Mar 21 05:19:26 crc kubenswrapper[4580]: I0321 05:19:26.138043 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:26 crc kubenswrapper[4580]: I0321 05:19:26.138108 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:26 crc 
kubenswrapper[4580]: I0321 05:19:26.197288 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:27 crc kubenswrapper[4580]: I0321 05:19:27.044465 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:27 crc kubenswrapper[4580]: I0321 05:19:27.099365 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xbmm"] Mar 21 05:19:29 crc kubenswrapper[4580]: I0321 05:19:29.011430 4580 generic.go:334] "Generic (PLEG): container finished" podID="251c60b9-f972-4aec-85af-f00d48e21662" containerID="b93e488a8f06e652f68d4cadca5e144a42cba63bd4edf3a95e1ebbf1b30359d5" exitCode=0 Mar 21 05:19:29 crc kubenswrapper[4580]: I0321 05:19:29.011874 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4xbmm" podUID="fe06f798-599a-4a42-9b4d-feded74e5ba7" containerName="registry-server" containerID="cri-o://abb47cd7680c3543059eba6d18aa3ed5be893eff901afa742cdc987c48393543" gracePeriod=2 Mar 21 05:19:29 crc kubenswrapper[4580]: I0321 05:19:29.011500 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" event={"ID":"251c60b9-f972-4aec-85af-f00d48e21662","Type":"ContainerDied","Data":"b93e488a8f06e652f68d4cadca5e144a42cba63bd4edf3a95e1ebbf1b30359d5"} Mar 21 05:19:29 crc kubenswrapper[4580]: I0321 05:19:29.504482 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:29 crc kubenswrapper[4580]: I0321 05:19:29.582051 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe06f798-599a-4a42-9b4d-feded74e5ba7-utilities\") pod \"fe06f798-599a-4a42-9b4d-feded74e5ba7\" (UID: \"fe06f798-599a-4a42-9b4d-feded74e5ba7\") " Mar 21 05:19:29 crc kubenswrapper[4580]: I0321 05:19:29.582207 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe06f798-599a-4a42-9b4d-feded74e5ba7-catalog-content\") pod \"fe06f798-599a-4a42-9b4d-feded74e5ba7\" (UID: \"fe06f798-599a-4a42-9b4d-feded74e5ba7\") " Mar 21 05:19:29 crc kubenswrapper[4580]: I0321 05:19:29.583257 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe06f798-599a-4a42-9b4d-feded74e5ba7-utilities" (OuterVolumeSpecName: "utilities") pod "fe06f798-599a-4a42-9b4d-feded74e5ba7" (UID: "fe06f798-599a-4a42-9b4d-feded74e5ba7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:19:29 crc kubenswrapper[4580]: I0321 05:19:29.612815 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe06f798-599a-4a42-9b4d-feded74e5ba7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe06f798-599a-4a42-9b4d-feded74e5ba7" (UID: "fe06f798-599a-4a42-9b4d-feded74e5ba7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:19:29 crc kubenswrapper[4580]: I0321 05:19:29.683535 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk8s9\" (UniqueName: \"kubernetes.io/projected/fe06f798-599a-4a42-9b4d-feded74e5ba7-kube-api-access-pk8s9\") pod \"fe06f798-599a-4a42-9b4d-feded74e5ba7\" (UID: \"fe06f798-599a-4a42-9b4d-feded74e5ba7\") " Mar 21 05:19:29 crc kubenswrapper[4580]: I0321 05:19:29.685356 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe06f798-599a-4a42-9b4d-feded74e5ba7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:19:29 crc kubenswrapper[4580]: I0321 05:19:29.685396 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe06f798-599a-4a42-9b4d-feded74e5ba7-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:19:29 crc kubenswrapper[4580]: I0321 05:19:29.695080 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe06f798-599a-4a42-9b4d-feded74e5ba7-kube-api-access-pk8s9" (OuterVolumeSpecName: "kube-api-access-pk8s9") pod "fe06f798-599a-4a42-9b4d-feded74e5ba7" (UID: "fe06f798-599a-4a42-9b4d-feded74e5ba7"). InnerVolumeSpecName "kube-api-access-pk8s9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:19:29 crc kubenswrapper[4580]: I0321 05:19:29.786424 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk8s9\" (UniqueName: \"kubernetes.io/projected/fe06f798-599a-4a42-9b4d-feded74e5ba7-kube-api-access-pk8s9\") on node \"crc\" DevicePath \"\"" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.022075 4580 generic.go:334] "Generic (PLEG): container finished" podID="fe06f798-599a-4a42-9b4d-feded74e5ba7" containerID="abb47cd7680c3543059eba6d18aa3ed5be893eff901afa742cdc987c48393543" exitCode=0 Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.022181 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xbmm" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.022176 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbmm" event={"ID":"fe06f798-599a-4a42-9b4d-feded74e5ba7","Type":"ContainerDied","Data":"abb47cd7680c3543059eba6d18aa3ed5be893eff901afa742cdc987c48393543"} Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.023919 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xbmm" event={"ID":"fe06f798-599a-4a42-9b4d-feded74e5ba7","Type":"ContainerDied","Data":"497b66ab50307338aea4cde56a8ce52da020b279a8f05e8c78b29315fc2bb0c4"} Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.023951 4580 scope.go:117] "RemoveContainer" containerID="abb47cd7680c3543059eba6d18aa3ed5be893eff901afa742cdc987c48393543" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.057833 4580 scope.go:117] "RemoveContainer" containerID="01417d25b05c71a5494d2319d03c669d9a3d5062e914df87ef181b6930dc67ab" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.089378 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xbmm"] Mar 21 05:19:30 crc 
kubenswrapper[4580]: I0321 05:19:30.100208 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xbmm"] Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.114161 4580 scope.go:117] "RemoveContainer" containerID="1d193f9605c19ce8cb619f0067abcf0d944e1a83108227880aa3e110f26a2ab2" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.142452 4580 scope.go:117] "RemoveContainer" containerID="abb47cd7680c3543059eba6d18aa3ed5be893eff901afa742cdc987c48393543" Mar 21 05:19:30 crc kubenswrapper[4580]: E0321 05:19:30.143144 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb47cd7680c3543059eba6d18aa3ed5be893eff901afa742cdc987c48393543\": container with ID starting with abb47cd7680c3543059eba6d18aa3ed5be893eff901afa742cdc987c48393543 not found: ID does not exist" containerID="abb47cd7680c3543059eba6d18aa3ed5be893eff901afa742cdc987c48393543" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.143184 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb47cd7680c3543059eba6d18aa3ed5be893eff901afa742cdc987c48393543"} err="failed to get container status \"abb47cd7680c3543059eba6d18aa3ed5be893eff901afa742cdc987c48393543\": rpc error: code = NotFound desc = could not find container \"abb47cd7680c3543059eba6d18aa3ed5be893eff901afa742cdc987c48393543\": container with ID starting with abb47cd7680c3543059eba6d18aa3ed5be893eff901afa742cdc987c48393543 not found: ID does not exist" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.143205 4580 scope.go:117] "RemoveContainer" containerID="01417d25b05c71a5494d2319d03c669d9a3d5062e914df87ef181b6930dc67ab" Mar 21 05:19:30 crc kubenswrapper[4580]: E0321 05:19:30.143594 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"01417d25b05c71a5494d2319d03c669d9a3d5062e914df87ef181b6930dc67ab\": container with ID starting with 01417d25b05c71a5494d2319d03c669d9a3d5062e914df87ef181b6930dc67ab not found: ID does not exist" containerID="01417d25b05c71a5494d2319d03c669d9a3d5062e914df87ef181b6930dc67ab" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.143640 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01417d25b05c71a5494d2319d03c669d9a3d5062e914df87ef181b6930dc67ab"} err="failed to get container status \"01417d25b05c71a5494d2319d03c669d9a3d5062e914df87ef181b6930dc67ab\": rpc error: code = NotFound desc = could not find container \"01417d25b05c71a5494d2319d03c669d9a3d5062e914df87ef181b6930dc67ab\": container with ID starting with 01417d25b05c71a5494d2319d03c669d9a3d5062e914df87ef181b6930dc67ab not found: ID does not exist" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.143669 4580 scope.go:117] "RemoveContainer" containerID="1d193f9605c19ce8cb619f0067abcf0d944e1a83108227880aa3e110f26a2ab2" Mar 21 05:19:30 crc kubenswrapper[4580]: E0321 05:19:30.144412 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d193f9605c19ce8cb619f0067abcf0d944e1a83108227880aa3e110f26a2ab2\": container with ID starting with 1d193f9605c19ce8cb619f0067abcf0d944e1a83108227880aa3e110f26a2ab2 not found: ID does not exist" containerID="1d193f9605c19ce8cb619f0067abcf0d944e1a83108227880aa3e110f26a2ab2" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.144441 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d193f9605c19ce8cb619f0067abcf0d944e1a83108227880aa3e110f26a2ab2"} err="failed to get container status \"1d193f9605c19ce8cb619f0067abcf0d944e1a83108227880aa3e110f26a2ab2\": rpc error: code = NotFound desc = could not find container \"1d193f9605c19ce8cb619f0067abcf0d944e1a83108227880aa3e110f26a2ab2\": container with ID 
starting with 1d193f9605c19ce8cb619f0067abcf0d944e1a83108227880aa3e110f26a2ab2 not found: ID does not exist" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.494133 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.499523 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/251c60b9-f972-4aec-85af-f00d48e21662-inventory\") pod \"251c60b9-f972-4aec-85af-f00d48e21662\" (UID: \"251c60b9-f972-4aec-85af-f00d48e21662\") " Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.499603 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/251c60b9-f972-4aec-85af-f00d48e21662-ssh-key-openstack-edpm-ipam\") pod \"251c60b9-f972-4aec-85af-f00d48e21662\" (UID: \"251c60b9-f972-4aec-85af-f00d48e21662\") " Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.499682 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxm9x\" (UniqueName: \"kubernetes.io/projected/251c60b9-f972-4aec-85af-f00d48e21662-kube-api-access-sxm9x\") pod \"251c60b9-f972-4aec-85af-f00d48e21662\" (UID: \"251c60b9-f972-4aec-85af-f00d48e21662\") " Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.506035 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/251c60b9-f972-4aec-85af-f00d48e21662-kube-api-access-sxm9x" (OuterVolumeSpecName: "kube-api-access-sxm9x") pod "251c60b9-f972-4aec-85af-f00d48e21662" (UID: "251c60b9-f972-4aec-85af-f00d48e21662"). InnerVolumeSpecName "kube-api-access-sxm9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.534077 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/251c60b9-f972-4aec-85af-f00d48e21662-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "251c60b9-f972-4aec-85af-f00d48e21662" (UID: "251c60b9-f972-4aec-85af-f00d48e21662"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.551007 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/251c60b9-f972-4aec-85af-f00d48e21662-inventory" (OuterVolumeSpecName: "inventory") pod "251c60b9-f972-4aec-85af-f00d48e21662" (UID: "251c60b9-f972-4aec-85af-f00d48e21662"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.601474 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/251c60b9-f972-4aec-85af-f00d48e21662-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.601513 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/251c60b9-f972-4aec-85af-f00d48e21662-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:19:30 crc kubenswrapper[4580]: I0321 05:19:30.601526 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxm9x\" (UniqueName: \"kubernetes.io/projected/251c60b9-f972-4aec-85af-f00d48e21662-kube-api-access-sxm9x\") on node \"crc\" DevicePath \"\"" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.061957 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" 
event={"ID":"251c60b9-f972-4aec-85af-f00d48e21662","Type":"ContainerDied","Data":"1c51aa8e45146d86527f59b29faad681e3dd2e728b7810c6e7df51fd8b96e678"} Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.062050 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c51aa8e45146d86527f59b29faad681e3dd2e728b7810c6e7df51fd8b96e678" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.062069 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dp8qx" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.119040 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x"] Mar 21 05:19:31 crc kubenswrapper[4580]: E0321 05:19:31.119760 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="251c60b9-f972-4aec-85af-f00d48e21662" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.119896 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="251c60b9-f972-4aec-85af-f00d48e21662" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 21 05:19:31 crc kubenswrapper[4580]: E0321 05:19:31.120010 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe06f798-599a-4a42-9b4d-feded74e5ba7" containerName="extract-content" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.120104 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe06f798-599a-4a42-9b4d-feded74e5ba7" containerName="extract-content" Mar 21 05:19:31 crc kubenswrapper[4580]: E0321 05:19:31.120199 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe06f798-599a-4a42-9b4d-feded74e5ba7" containerName="registry-server" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.120267 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe06f798-599a-4a42-9b4d-feded74e5ba7" 
containerName="registry-server" Mar 21 05:19:31 crc kubenswrapper[4580]: E0321 05:19:31.120348 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe06f798-599a-4a42-9b4d-feded74e5ba7" containerName="extract-utilities" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.120418 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe06f798-599a-4a42-9b4d-feded74e5ba7" containerName="extract-utilities" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.120678 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe06f798-599a-4a42-9b4d-feded74e5ba7" containerName="registry-server" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.120760 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="251c60b9-f972-4aec-85af-f00d48e21662" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.121493 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.124297 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.126643 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.126797 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.129344 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.136018 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x"] Mar 21 05:19:31 crc 
kubenswrapper[4580]: I0321 05:19:31.211801 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.211900 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.212086 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.212215 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mllz2\" (UniqueName: \"kubernetes.io/projected/ddfb2a5d-1386-4dac-aee6-316bce48c76b-kube-api-access-mllz2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.313537 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.313648 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mllz2\" (UniqueName: \"kubernetes.io/projected/ddfb2a5d-1386-4dac-aee6-316bce48c76b-kube-api-access-mllz2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.314211 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.314742 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.318195 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x\" (UID: 
\"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.318681 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.319861 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.332388 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mllz2\" (UniqueName: \"kubernetes.io/projected/ddfb2a5d-1386-4dac-aee6-316bce48c76b-kube-api-access-mllz2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.483081 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x"
Mar 21 05:19:31 crc kubenswrapper[4580]: I0321 05:19:31.629996 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe06f798-599a-4a42-9b4d-feded74e5ba7" path="/var/lib/kubelet/pods/fe06f798-599a-4a42-9b4d-feded74e5ba7/volumes"
Mar 21 05:19:32 crc kubenswrapper[4580]: I0321 05:19:32.023617 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x"]
Mar 21 05:19:32 crc kubenswrapper[4580]: I0321 05:19:32.075725 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" event={"ID":"ddfb2a5d-1386-4dac-aee6-316bce48c76b","Type":"ContainerStarted","Data":"76482ee405395c145cf4a1c8fd8db39b10f291cd8da4060cb0b57dde1184fc9b"}
Mar 21 05:19:33 crc kubenswrapper[4580]: I0321 05:19:33.086198 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" event={"ID":"ddfb2a5d-1386-4dac-aee6-316bce48c76b","Type":"ContainerStarted","Data":"da0209769142ac3b2ec0c66fe34ef45ac2bbffd6d067e76fef5e3b7b479f9100"}
Mar 21 05:19:33 crc kubenswrapper[4580]: I0321 05:19:33.106083 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" podStartSLOduration=1.615275153 podStartE2EDuration="2.106058244s" podCreationTimestamp="2026-03-21 05:19:31 +0000 UTC" firstStartedPulling="2026-03-21 05:19:32.023458987 +0000 UTC m=+1677.106042615" lastFinishedPulling="2026-03-21 05:19:32.514242078 +0000 UTC m=+1677.596825706" observedRunningTime="2026-03-21 05:19:33.100585529 +0000 UTC m=+1678.183169147" watchObservedRunningTime="2026-03-21 05:19:33.106058244 +0000 UTC m=+1678.188641892"
Mar 21 05:20:00 crc kubenswrapper[4580]: I0321 05:20:00.149467 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567840-wstd5"]
Mar 21 05:20:00 crc kubenswrapper[4580]: I0321 05:20:00.151520 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-wstd5"
Mar 21 05:20:00 crc kubenswrapper[4580]: I0321 05:20:00.154276 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc"
Mar 21 05:20:00 crc kubenswrapper[4580]: I0321 05:20:00.154396 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:20:00 crc kubenswrapper[4580]: I0321 05:20:00.156802 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:20:00 crc kubenswrapper[4580]: I0321 05:20:00.161610 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-wstd5"]
Mar 21 05:20:00 crc kubenswrapper[4580]: I0321 05:20:00.218342 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh7g9\" (UniqueName: \"kubernetes.io/projected/1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb-kube-api-access-gh7g9\") pod \"auto-csr-approver-29567840-wstd5\" (UID: \"1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb\") " pod="openshift-infra/auto-csr-approver-29567840-wstd5"
Mar 21 05:20:00 crc kubenswrapper[4580]: I0321 05:20:00.320408 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh7g9\" (UniqueName: \"kubernetes.io/projected/1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb-kube-api-access-gh7g9\") pod \"auto-csr-approver-29567840-wstd5\" (UID: \"1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb\") " pod="openshift-infra/auto-csr-approver-29567840-wstd5"
Mar 21 05:20:00 crc kubenswrapper[4580]: I0321 05:20:00.340589 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh7g9\" (UniqueName: \"kubernetes.io/projected/1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb-kube-api-access-gh7g9\") pod \"auto-csr-approver-29567840-wstd5\" (UID: \"1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb\") " pod="openshift-infra/auto-csr-approver-29567840-wstd5"
Mar 21 05:20:00 crc kubenswrapper[4580]: I0321 05:20:00.479593 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-wstd5"
Mar 21 05:20:00 crc kubenswrapper[4580]: I0321 05:20:00.966699 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-wstd5"]
Mar 21 05:20:01 crc kubenswrapper[4580]: I0321 05:20:01.379987 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567840-wstd5" event={"ID":"1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb","Type":"ContainerStarted","Data":"6826263e05ac6d2137a0ad610495a9a277845a4ce34d6301cab02577baee7af1"}
Mar 21 05:20:03 crc kubenswrapper[4580]: I0321 05:20:03.408934 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567840-wstd5" event={"ID":"1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb","Type":"ContainerStarted","Data":"ca3d9cf7e617312d01d2a1e54bb27fc9e109de41500d1f435dc31c73a03ad3fc"}
Mar 21 05:20:03 crc kubenswrapper[4580]: I0321 05:20:03.433094 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567840-wstd5" podStartSLOduration=2.353985902 podStartE2EDuration="3.433074007s" podCreationTimestamp="2026-03-21 05:20:00 +0000 UTC" firstStartedPulling="2026-03-21 05:20:00.962303654 +0000 UTC m=+1706.044887282" lastFinishedPulling="2026-03-21 05:20:02.041391759 +0000 UTC m=+1707.123975387" observedRunningTime="2026-03-21 05:20:03.425708321 +0000 UTC m=+1708.508291979" watchObservedRunningTime="2026-03-21 05:20:03.433074007 +0000 UTC m=+1708.515657635"
Mar 21 05:20:05 crc kubenswrapper[4580]: I0321 05:20:05.425509 4580 generic.go:334] "Generic (PLEG): container finished" podID="1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb" containerID="ca3d9cf7e617312d01d2a1e54bb27fc9e109de41500d1f435dc31c73a03ad3fc" exitCode=0
Mar 21 05:20:05 crc kubenswrapper[4580]: I0321 05:20:05.425542 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567840-wstd5" event={"ID":"1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb","Type":"ContainerDied","Data":"ca3d9cf7e617312d01d2a1e54bb27fc9e109de41500d1f435dc31c73a03ad3fc"}
Mar 21 05:20:06 crc kubenswrapper[4580]: I0321 05:20:06.802111 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-wstd5"
Mar 21 05:20:06 crc kubenswrapper[4580]: I0321 05:20:06.857841 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh7g9\" (UniqueName: \"kubernetes.io/projected/1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb-kube-api-access-gh7g9\") pod \"1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb\" (UID: \"1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb\") "
Mar 21 05:20:06 crc kubenswrapper[4580]: I0321 05:20:06.868077 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb-kube-api-access-gh7g9" (OuterVolumeSpecName: "kube-api-access-gh7g9") pod "1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb" (UID: "1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb"). InnerVolumeSpecName "kube-api-access-gh7g9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:20:06 crc kubenswrapper[4580]: I0321 05:20:06.963214 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh7g9\" (UniqueName: \"kubernetes.io/projected/1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb-kube-api-access-gh7g9\") on node \"crc\" DevicePath \"\""
Mar 21 05:20:07 crc kubenswrapper[4580]: I0321 05:20:07.445429 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567840-wstd5" event={"ID":"1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb","Type":"ContainerDied","Data":"6826263e05ac6d2137a0ad610495a9a277845a4ce34d6301cab02577baee7af1"}
Mar 21 05:20:07 crc kubenswrapper[4580]: I0321 05:20:07.446009 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6826263e05ac6d2137a0ad610495a9a277845a4ce34d6301cab02577baee7af1"
Mar 21 05:20:07 crc kubenswrapper[4580]: I0321 05:20:07.445582 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-wstd5"
Mar 21 05:20:07 crc kubenswrapper[4580]: I0321 05:20:07.524511 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-5xjpq"]
Mar 21 05:20:07 crc kubenswrapper[4580]: I0321 05:20:07.537897 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-5xjpq"]
Mar 21 05:20:07 crc kubenswrapper[4580]: I0321 05:20:07.627955 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80b82bd4-51bb-4e57-9c87-779dc26bfbf1" path="/var/lib/kubelet/pods/80b82bd4-51bb-4e57-9c87-779dc26bfbf1/volumes"
Mar 21 05:20:11 crc kubenswrapper[4580]: I0321 05:20:11.239097 4580 scope.go:117] "RemoveContainer" containerID="d08bb178893463b29248137ba30bca92d626c1f48e945eba88f9d5edd1d723b9"
Mar 21 05:20:45 crc kubenswrapper[4580]: I0321 05:20:45.948232 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:20:45 crc kubenswrapper[4580]: I0321 05:20:45.948894 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:21:15 crc kubenswrapper[4580]: I0321 05:21:15.948085 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:21:15 crc kubenswrapper[4580]: I0321 05:21:15.949773 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:21:41 crc kubenswrapper[4580]: I0321 05:21:41.045490 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-64qzs"]
Mar 21 05:21:41 crc kubenswrapper[4580]: I0321 05:21:41.062212 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6hzdk"]
Mar 21 05:21:41 crc kubenswrapper[4580]: I0321 05:21:41.073098 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-81c4-account-create-update-khzvr"]
Mar 21 05:21:41 crc kubenswrapper[4580]: I0321 05:21:41.083342 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-64qzs"]
Mar 21 05:21:41 crc kubenswrapper[4580]: I0321 05:21:41.095557 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-81c4-account-create-update-khzvr"]
Mar 21 05:21:41 crc kubenswrapper[4580]: I0321 05:21:41.107892 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7442-account-create-update-fqvpj"]
Mar 21 05:21:41 crc kubenswrapper[4580]: I0321 05:21:41.119554 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6hzdk"]
Mar 21 05:21:41 crc kubenswrapper[4580]: I0321 05:21:41.128125 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7442-account-create-update-fqvpj"]
Mar 21 05:21:41 crc kubenswrapper[4580]: I0321 05:21:41.628742 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6afa77b-c41a-42ad-a253-64cf7e6e5544" path="/var/lib/kubelet/pods/b6afa77b-c41a-42ad-a253-64cf7e6e5544/volumes"
Mar 21 05:21:41 crc kubenswrapper[4580]: I0321 05:21:41.630488 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf6c0d9f-d17b-458f-a653-713f834d70cc" path="/var/lib/kubelet/pods/bf6c0d9f-d17b-458f-a653-713f834d70cc/volumes"
Mar 21 05:21:41 crc kubenswrapper[4580]: I0321 05:21:41.634638 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca4f77fa-08f5-4c3c-9300-112440f9acc1" path="/var/lib/kubelet/pods/ca4f77fa-08f5-4c3c-9300-112440f9acc1/volumes"
Mar 21 05:21:41 crc kubenswrapper[4580]: I0321 05:21:41.637482 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34fd0f6-3c75-42ff-909d-32c6255e5c68" path="/var/lib/kubelet/pods/f34fd0f6-3c75-42ff-909d-32c6255e5c68/volumes"
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.042833 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-mwp8r"]
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.057713 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-mwp8r"]
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.067583 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hnpbd"]
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.078229 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hnpbd"]
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.086873 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8ced-account-create-update-rvjp7"]
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.095366 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8ced-account-create-update-rvjp7"]
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.630307 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fbe609f-6d91-4f8b-82b9-17a602597351" path="/var/lib/kubelet/pods/5fbe609f-6d91-4f8b-82b9-17a602597351/volumes"
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.633635 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da615d4-b81c-4ad0-90b2-23e4029c949c" path="/var/lib/kubelet/pods/6da615d4-b81c-4ad0-90b2-23e4029c949c/volumes"
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.639213 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0d1dcc7-e806-4348-94ab-347efc9930b7" path="/var/lib/kubelet/pods/d0d1dcc7-e806-4348-94ab-347efc9930b7/volumes"
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.947412 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.947469 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.947510 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj"
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.948195 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 21 05:21:45 crc kubenswrapper[4580]: I0321 05:21:45.948255 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" gracePeriod=600
Mar 21 05:21:46 crc kubenswrapper[4580]: E0321 05:21:46.075738 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664"
Mar 21 05:21:46 crc kubenswrapper[4580]: I0321 05:21:46.327965 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" exitCode=0
Mar 21 05:21:46 crc kubenswrapper[4580]: I0321 05:21:46.328032 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45"}
Mar 21 05:21:46 crc kubenswrapper[4580]: I0321 05:21:46.328374 4580 scope.go:117] "RemoveContainer" containerID="b0b67d8190c897455e564af68d56eb7f7f1eabacada737f44e7b09e47464a936"
Mar 21 05:21:46 crc kubenswrapper[4580]: I0321 05:21:46.328973 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45"
Mar 21 05:21:46 crc kubenswrapper[4580]: E0321 05:21:46.329255 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664"
Mar 21 05:21:56 crc kubenswrapper[4580]: I0321 05:21:56.618874 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45"
Mar 21 05:21:56 crc kubenswrapper[4580]: E0321 05:21:56.619645 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664"
Mar 21 05:21:57 crc kubenswrapper[4580]: I0321 05:21:57.036352 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-smbzn"]
Mar 21 05:21:57 crc kubenswrapper[4580]: I0321 05:21:57.052138 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-kkck5"]
Mar 21 05:21:57 crc kubenswrapper[4580]: I0321 05:21:57.062712 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-smbzn"]
Mar 21 05:21:57 crc kubenswrapper[4580]: I0321 05:21:57.070732 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-38bf-account-create-update-pwp74"]
Mar 21 05:21:57 crc kubenswrapper[4580]: I0321 05:21:57.080843 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-kkck5"]
Mar 21 05:21:57 crc kubenswrapper[4580]: I0321 05:21:57.086463 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-38bf-account-create-update-pwp74"]
Mar 21 05:21:57 crc kubenswrapper[4580]: I0321 05:21:57.649715 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b154885-0b18-4926-b81c-c10208075c27" path="/var/lib/kubelet/pods/3b154885-0b18-4926-b81c-c10208075c27/volumes"
Mar 21 05:21:57 crc kubenswrapper[4580]: I0321 05:21:57.652306 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e111d88-8363-418a-aa16-e4738fc5dfa5" path="/var/lib/kubelet/pods/8e111d88-8363-418a-aa16-e4738fc5dfa5/volumes"
Mar 21 05:21:57 crc kubenswrapper[4580]: I0321 05:21:57.674242 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949065da-d504-4d31-800b-cfb6b82bb559" path="/var/lib/kubelet/pods/949065da-d504-4d31-800b-cfb6b82bb559/volumes"
Mar 21 05:21:58 crc kubenswrapper[4580]: I0321 05:21:58.027281 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-26d4d"]
Mar 21 05:21:58 crc kubenswrapper[4580]: I0321 05:21:58.034595 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0055-account-create-update-jxb6l"]
Mar 21 05:21:58 crc kubenswrapper[4580]: I0321 05:21:58.042113 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-01e5-account-create-update-ptjxt"]
Mar 21 05:21:58 crc kubenswrapper[4580]: I0321 05:21:58.053884 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0055-account-create-update-jxb6l"]
Mar 21 05:21:58 crc kubenswrapper[4580]: I0321 05:21:58.063272 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-26d4d"]
Mar 21 05:21:58 crc kubenswrapper[4580]: I0321 05:21:58.071361 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-01e5-account-create-update-ptjxt"]
Mar 21 05:21:59 crc kubenswrapper[4580]: I0321 05:21:59.636005 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ec630d4-9254-4a93-b61f-d960ac2b3ccc" path="/var/lib/kubelet/pods/1ec630d4-9254-4a93-b61f-d960ac2b3ccc/volumes"
Mar 21 05:21:59 crc kubenswrapper[4580]: I0321 05:21:59.639938 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="486431e9-d00c-4a61-b831-619c73ef470f" path="/var/lib/kubelet/pods/486431e9-d00c-4a61-b831-619c73ef470f/volumes"
Mar 21 05:21:59 crc kubenswrapper[4580]: I0321 05:21:59.643761 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33" path="/var/lib/kubelet/pods/8f99d1ec-af0a-4cc2-bfb3-86b618a2fd33/volumes"
Mar 21 05:22:00 crc kubenswrapper[4580]: I0321 05:22:00.142436 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567842-7t89q"]
Mar 21 05:22:00 crc kubenswrapper[4580]: E0321 05:22:00.142932 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb" containerName="oc"
Mar 21 05:22:00 crc kubenswrapper[4580]: I0321 05:22:00.142954 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb" containerName="oc"
Mar 21 05:22:00 crc kubenswrapper[4580]: I0321 05:22:00.143196 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb" containerName="oc"
Mar 21 05:22:00 crc kubenswrapper[4580]: I0321 05:22:00.144218 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-7t89q"
Mar 21 05:22:00 crc kubenswrapper[4580]: I0321 05:22:00.153770 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-7t89q"]
Mar 21 05:22:00 crc kubenswrapper[4580]: I0321 05:22:00.155903 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:22:00 crc kubenswrapper[4580]: I0321 05:22:00.156208 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc"
Mar 21 05:22:00 crc kubenswrapper[4580]: I0321 05:22:00.158548 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:22:00 crc kubenswrapper[4580]: I0321 05:22:00.304731 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lxsk\" (UniqueName: \"kubernetes.io/projected/b750a30f-e9aa-4e9d-950c-66ee25d90139-kube-api-access-6lxsk\") pod \"auto-csr-approver-29567842-7t89q\" (UID: \"b750a30f-e9aa-4e9d-950c-66ee25d90139\") " pod="openshift-infra/auto-csr-approver-29567842-7t89q"
Mar 21 05:22:00 crc kubenswrapper[4580]: I0321 05:22:00.406704 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lxsk\" (UniqueName: \"kubernetes.io/projected/b750a30f-e9aa-4e9d-950c-66ee25d90139-kube-api-access-6lxsk\") pod \"auto-csr-approver-29567842-7t89q\" (UID: \"b750a30f-e9aa-4e9d-950c-66ee25d90139\") " pod="openshift-infra/auto-csr-approver-29567842-7t89q"
Mar 21 05:22:00 crc kubenswrapper[4580]: I0321 05:22:00.424314 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lxsk\" (UniqueName: \"kubernetes.io/projected/b750a30f-e9aa-4e9d-950c-66ee25d90139-kube-api-access-6lxsk\") pod \"auto-csr-approver-29567842-7t89q\" (UID: \"b750a30f-e9aa-4e9d-950c-66ee25d90139\") " pod="openshift-infra/auto-csr-approver-29567842-7t89q"
Mar 21 05:22:00 crc kubenswrapper[4580]: I0321 05:22:00.466275 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-7t89q"
Mar 21 05:22:00 crc kubenswrapper[4580]: I0321 05:22:00.954638 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-7t89q"]
Mar 21 05:22:01 crc kubenswrapper[4580]: I0321 05:22:01.449704 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567842-7t89q" event={"ID":"b750a30f-e9aa-4e9d-950c-66ee25d90139","Type":"ContainerStarted","Data":"2426c5be1afaa73c76a5b57e44207d9e627f64f0dd2df06d80d2b30aaf2b0327"}
Mar 21 05:22:02 crc kubenswrapper[4580]: I0321 05:22:02.463148 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567842-7t89q" event={"ID":"b750a30f-e9aa-4e9d-950c-66ee25d90139","Type":"ContainerStarted","Data":"d39090e5cec9269bdce8924e7aaa9430a8ca451cd67cff60d8c78ac08d6de8eb"}
Mar 21 05:22:02 crc kubenswrapper[4580]: I0321 05:22:02.482588 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567842-7t89q" podStartSLOduration=1.313563249 podStartE2EDuration="2.482569581s" podCreationTimestamp="2026-03-21 05:22:00 +0000 UTC" firstStartedPulling="2026-03-21 05:22:00.952296158 +0000 UTC m=+1826.034879796" lastFinishedPulling="2026-03-21 05:22:02.1213025 +0000 UTC m=+1827.203886128" observedRunningTime="2026-03-21 05:22:02.474988277 +0000 UTC m=+1827.557571915" watchObservedRunningTime="2026-03-21 05:22:02.482569581 +0000 UTC m=+1827.565153209"
Mar 21 05:22:03 crc kubenswrapper[4580]: I0321 05:22:03.481202 4580 generic.go:334] "Generic (PLEG): container finished" podID="b750a30f-e9aa-4e9d-950c-66ee25d90139" containerID="d39090e5cec9269bdce8924e7aaa9430a8ca451cd67cff60d8c78ac08d6de8eb" exitCode=0
Mar 21 05:22:03 crc kubenswrapper[4580]: I0321 05:22:03.481452 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567842-7t89q" event={"ID":"b750a30f-e9aa-4e9d-950c-66ee25d90139","Type":"ContainerDied","Data":"d39090e5cec9269bdce8924e7aaa9430a8ca451cd67cff60d8c78ac08d6de8eb"}
Mar 21 05:22:04 crc kubenswrapper[4580]: I0321 05:22:04.919619 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-7t89q"
Mar 21 05:22:04 crc kubenswrapper[4580]: I0321 05:22:04.994957 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lxsk\" (UniqueName: \"kubernetes.io/projected/b750a30f-e9aa-4e9d-950c-66ee25d90139-kube-api-access-6lxsk\") pod \"b750a30f-e9aa-4e9d-950c-66ee25d90139\" (UID: \"b750a30f-e9aa-4e9d-950c-66ee25d90139\") "
Mar 21 05:22:05 crc kubenswrapper[4580]: I0321 05:22:05.002426 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b750a30f-e9aa-4e9d-950c-66ee25d90139-kube-api-access-6lxsk" (OuterVolumeSpecName: "kube-api-access-6lxsk") pod "b750a30f-e9aa-4e9d-950c-66ee25d90139" (UID: "b750a30f-e9aa-4e9d-950c-66ee25d90139"). InnerVolumeSpecName "kube-api-access-6lxsk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:22:05 crc kubenswrapper[4580]: I0321 05:22:05.097559 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lxsk\" (UniqueName: \"kubernetes.io/projected/b750a30f-e9aa-4e9d-950c-66ee25d90139-kube-api-access-6lxsk\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:05 crc kubenswrapper[4580]: I0321 05:22:05.496487 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567842-7t89q" event={"ID":"b750a30f-e9aa-4e9d-950c-66ee25d90139","Type":"ContainerDied","Data":"2426c5be1afaa73c76a5b57e44207d9e627f64f0dd2df06d80d2b30aaf2b0327"}
Mar 21 05:22:05 crc kubenswrapper[4580]: I0321 05:22:05.496756 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2426c5be1afaa73c76a5b57e44207d9e627f64f0dd2df06d80d2b30aaf2b0327"
Mar 21 05:22:05 crc kubenswrapper[4580]: I0321 05:22:05.496520 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-7t89q"
Mar 21 05:22:05 crc kubenswrapper[4580]: I0321 05:22:05.542188 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-4prkc"]
Mar 21 05:22:05 crc kubenswrapper[4580]: I0321 05:22:05.552030 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-4prkc"]
Mar 21 05:22:05 crc kubenswrapper[4580]: I0321 05:22:05.633422 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe65e038-1d7f-470f-88ea-ef5352681356" path="/var/lib/kubelet/pods/fe65e038-1d7f-470f-88ea-ef5352681356/volumes"
Mar 21 05:22:07 crc kubenswrapper[4580]: I0321 05:22:07.618724 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45"
Mar 21 05:22:07 crc kubenswrapper[4580]: E0321 05:22:07.619239 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.366829 4580 scope.go:117] "RemoveContainer" containerID="a4d48039e22c74290ecbab3c2c18b56cecfaa3b35f380680d35279a2419f88eb"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.414095 4580 scope.go:117] "RemoveContainer" containerID="05237fba639f5471d66275a17fc51decfe9e9c5944d0cd65f44b60a5f33bf9cf"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.482964 4580 scope.go:117] "RemoveContainer" containerID="5216f1062d76a636ca1b786578bf61c096a09052c1ea40ecb1f3b558aad85422"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.520911 4580 scope.go:117] "RemoveContainer" containerID="e91bc741dfda5f6fd527ee94e9c01e60a3aec34ca4a7466b7e49b84456bfa05d"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.562169 4580 scope.go:117] "RemoveContainer" containerID="0ca53a7f97daaabc25f3bf440d40fc5b8f4aa72da12926007c9f913f5c361660"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.618699 4580 scope.go:117] "RemoveContainer" containerID="151a2d4133d4fda411cb7e2461df9558ff73ef73cfc87d2612c67236b6aae24b"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.642326 4580 scope.go:117] "RemoveContainer" containerID="4abb427e8ef3ed91ba0d42cfab260c30f4ef95eb6459fe530829f6d00e9b02fe"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.680591 4580 scope.go:117] "RemoveContainer" containerID="f90afca2acfaaae581f27bc22e5b2a0a0d1e1168cc4e0427b1e6653bed4a6160"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.776980 4580 scope.go:117] "RemoveContainer" containerID="6e80d1c51411e1a99aac75aeae886dd19082c3fdfd0ed8fb3896665f03fcc330"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.802082 4580 scope.go:117] "RemoveContainer" containerID="75aed0c59374edf1569b77da32cdb9b7e67e7e26d83e53c2564c247d2d8910a6"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.829141 4580 scope.go:117] "RemoveContainer" containerID="5b4ddf91f9e1f80a62c63f53c4c525fda532989ab0b4bc8ca18d83e29b62ab2b"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.855519 4580 scope.go:117] "RemoveContainer" containerID="28fe3b05e2cf1e604a0b3a0b38c1d58bf2380cd6082055556dd6d4c3f3bbdb53"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.879901 4580 scope.go:117] "RemoveContainer" containerID="35e740c8a272a667638641c3119b6bc3f021d8e408a408c6e63666b815dd286f"
Mar 21 05:22:11 crc kubenswrapper[4580]: I0321 05:22:11.936050 4580 scope.go:117] "RemoveContainer" containerID="2b29bf2ce9732a549b39900bc56fa49051a6e8c505c055ac216f900b7fce97c3"
Mar 21 05:22:12 crc kubenswrapper[4580]: I0321 05:22:12.011041 4580 scope.go:117] "RemoveContainer" containerID="88709bbb3c283cf655abc2c25069420c06e5ed4452325b4f16aa2da2adc0772a"
Mar 21 05:22:12 crc kubenswrapper[4580]: I0321 05:22:12.038076 4580 scope.go:117] "RemoveContainer" containerID="da08ecf9ae625bdbd7251b180ce869bbfb2690a1bde5bb47b6452d0382dfd919"
Mar 21 05:22:19 crc kubenswrapper[4580]: I0321 05:22:19.618605 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45"
Mar 21 05:22:19 crc kubenswrapper[4580]: E0321 05:22:19.619338 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664"
Mar 21 05:22:20 crc kubenswrapper[4580]: I0321 05:22:20.035703 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-zxh5l"]
Mar 21 05:22:20 crc kubenswrapper[4580]: I0321 05:22:20.043655 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-zxh5l"]
Mar 21 05:22:21 crc kubenswrapper[4580]: I0321 05:22:21.629173 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d29c081-c37a-46ac-8354-178685882ce2" path="/var/lib/kubelet/pods/0d29c081-c37a-46ac-8354-178685882ce2/volumes"
Mar 21 05:22:32 crc kubenswrapper[4580]: I0321 05:22:32.618402 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45"
Mar 21 05:22:32 crc kubenswrapper[4580]: E0321 05:22:32.619142 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664"
Mar 21 05:22:34 crc kubenswrapper[4580]: I0321 05:22:34.785567 4580 generic.go:334] "Generic (PLEG): container finished" podID="ddfb2a5d-1386-4dac-aee6-316bce48c76b" containerID="da0209769142ac3b2ec0c66fe34ef45ac2bbffd6d067e76fef5e3b7b479f9100" exitCode=0
Mar 21 05:22:34 crc kubenswrapper[4580]: I0321 05:22:34.785681 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" event={"ID":"ddfb2a5d-1386-4dac-aee6-316bce48c76b","Type":"ContainerDied","Data":"da0209769142ac3b2ec0c66fe34ef45ac2bbffd6d067e76fef5e3b7b479f9100"}
Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.244123 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x"
Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.442433 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-inventory\") pod \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") "
Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.442644 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-bootstrap-combined-ca-bundle\") pod \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") "
Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.442717 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mllz2\" (UniqueName: \"kubernetes.io/projected/ddfb2a5d-1386-4dac-aee6-316bce48c76b-kube-api-access-mllz2\") pod \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") "
Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.442798 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-ssh-key-openstack-edpm-ipam\") pod \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\" (UID: \"ddfb2a5d-1386-4dac-aee6-316bce48c76b\") "
Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.449540 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ddfb2a5d-1386-4dac-aee6-316bce48c76b" (UID: "ddfb2a5d-1386-4dac-aee6-316bce48c76b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.451924 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddfb2a5d-1386-4dac-aee6-316bce48c76b-kube-api-access-mllz2" (OuterVolumeSpecName: "kube-api-access-mllz2") pod "ddfb2a5d-1386-4dac-aee6-316bce48c76b" (UID: "ddfb2a5d-1386-4dac-aee6-316bce48c76b"). InnerVolumeSpecName "kube-api-access-mllz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.471089 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ddfb2a5d-1386-4dac-aee6-316bce48c76b" (UID: "ddfb2a5d-1386-4dac-aee6-316bce48c76b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.472433 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-inventory" (OuterVolumeSpecName: "inventory") pod "ddfb2a5d-1386-4dac-aee6-316bce48c76b" (UID: "ddfb2a5d-1386-4dac-aee6-316bce48c76b"). InnerVolumeSpecName "inventory".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.544559 4580 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.544610 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mllz2\" (UniqueName: \"kubernetes.io/projected/ddfb2a5d-1386-4dac-aee6-316bce48c76b-kube-api-access-mllz2\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.544622 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.544635 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddfb2a5d-1386-4dac-aee6-316bce48c76b-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.804730 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" event={"ID":"ddfb2a5d-1386-4dac-aee6-316bce48c76b","Type":"ContainerDied","Data":"76482ee405395c145cf4a1c8fd8db39b10f291cd8da4060cb0b57dde1184fc9b"} Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.805022 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76482ee405395c145cf4a1c8fd8db39b10f291cd8da4060cb0b57dde1184fc9b" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.804862 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.905040 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d"] Mar 21 05:22:36 crc kubenswrapper[4580]: E0321 05:22:36.905402 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b750a30f-e9aa-4e9d-950c-66ee25d90139" containerName="oc" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.905417 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b750a30f-e9aa-4e9d-950c-66ee25d90139" containerName="oc" Mar 21 05:22:36 crc kubenswrapper[4580]: E0321 05:22:36.905432 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddfb2a5d-1386-4dac-aee6-316bce48c76b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.905441 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfb2a5d-1386-4dac-aee6-316bce48c76b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.905610 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddfb2a5d-1386-4dac-aee6-316bce48c76b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.905646 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="b750a30f-e9aa-4e9d-950c-66ee25d90139" containerName="oc" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.906261 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.909021 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.909214 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.909844 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.914798 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:22:36 crc kubenswrapper[4580]: I0321 05:22:36.932631 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d"] Mar 21 05:22:37 crc kubenswrapper[4580]: I0321 05:22:37.054404 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fl44d\" (UID: \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" Mar 21 05:22:37 crc kubenswrapper[4580]: I0321 05:22:37.054517 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fl44d\" (UID: \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" Mar 21 05:22:37 crc kubenswrapper[4580]: I0321 
05:22:37.055009 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzd8z\" (UniqueName: \"kubernetes.io/projected/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-kube-api-access-nzd8z\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fl44d\" (UID: \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" Mar 21 05:22:37 crc kubenswrapper[4580]: I0321 05:22:37.156350 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzd8z\" (UniqueName: \"kubernetes.io/projected/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-kube-api-access-nzd8z\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fl44d\" (UID: \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" Mar 21 05:22:37 crc kubenswrapper[4580]: I0321 05:22:37.156423 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fl44d\" (UID: \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" Mar 21 05:22:37 crc kubenswrapper[4580]: I0321 05:22:37.156457 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fl44d\" (UID: \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" Mar 21 05:22:37 crc kubenswrapper[4580]: I0321 05:22:37.161048 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fl44d\" (UID: \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" Mar 21 05:22:37 crc kubenswrapper[4580]: I0321 05:22:37.162563 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fl44d\" (UID: \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" Mar 21 05:22:37 crc kubenswrapper[4580]: I0321 05:22:37.172804 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzd8z\" (UniqueName: \"kubernetes.io/projected/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-kube-api-access-nzd8z\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fl44d\" (UID: \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" Mar 21 05:22:37 crc kubenswrapper[4580]: I0321 05:22:37.224184 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" Mar 21 05:22:37 crc kubenswrapper[4580]: I0321 05:22:37.777306 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d"] Mar 21 05:22:37 crc kubenswrapper[4580]: I0321 05:22:37.824055 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" event={"ID":"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f","Type":"ContainerStarted","Data":"245f9e6dca60717edf72904ecb94532c2d756d4ad60ace22661a9a30b26dc454"} Mar 21 05:22:39 crc kubenswrapper[4580]: I0321 05:22:39.842642 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" event={"ID":"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f","Type":"ContainerStarted","Data":"438988393f712c308fff629e9676354cb6897623dcbb9a5b979cf7f33cae635c"} Mar 21 05:22:39 crc kubenswrapper[4580]: I0321 05:22:39.869186 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" podStartSLOduration=2.996473635 podStartE2EDuration="3.869158568s" podCreationTimestamp="2026-03-21 05:22:36 +0000 UTC" firstStartedPulling="2026-03-21 05:22:37.781755625 +0000 UTC m=+1862.864339263" lastFinishedPulling="2026-03-21 05:22:38.654440568 +0000 UTC m=+1863.737024196" observedRunningTime="2026-03-21 05:22:39.86030571 +0000 UTC m=+1864.942889348" watchObservedRunningTime="2026-03-21 05:22:39.869158568 +0000 UTC m=+1864.951742206" Mar 21 05:22:45 crc kubenswrapper[4580]: I0321 05:22:45.623930 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:22:45 crc kubenswrapper[4580]: E0321 05:22:45.624903 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:23:00 crc kubenswrapper[4580]: I0321 05:23:00.618649 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:23:00 crc kubenswrapper[4580]: E0321 05:23:00.619357 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:23:11 crc kubenswrapper[4580]: I0321 05:23:11.345437 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-n4vzq"] Mar 21 05:23:11 crc kubenswrapper[4580]: I0321 05:23:11.355860 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-n4vzq"] Mar 21 05:23:11 crc kubenswrapper[4580]: I0321 05:23:11.630076 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f86c26eb-bb22-460c-8a45-191a02924112" path="/var/lib/kubelet/pods/f86c26eb-bb22-460c-8a45-191a02924112/volumes" Mar 21 05:23:12 crc kubenswrapper[4580]: I0321 05:23:12.460710 4580 scope.go:117] "RemoveContainer" containerID="3ff385e1bbcee324592b75b1425bea21d93e458294c849c13f312dc58af9311a" Mar 21 05:23:12 crc kubenswrapper[4580]: I0321 05:23:12.535450 4580 scope.go:117] "RemoveContainer" containerID="2feacbdc69c1b37fcb3920f7fd6f4520a27eed9b90712074b5be2625073e72b4" Mar 21 05:23:13 crc kubenswrapper[4580]: I0321 05:23:13.618422 4580 scope.go:117] "RemoveContainer" 
containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:23:13 crc kubenswrapper[4580]: E0321 05:23:13.619158 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:23:18 crc kubenswrapper[4580]: I0321 05:23:18.030901 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-cgz9l"] Mar 21 05:23:18 crc kubenswrapper[4580]: I0321 05:23:18.039168 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-cgz9l"] Mar 21 05:23:19 crc kubenswrapper[4580]: I0321 05:23:19.632215 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2234c053-0318-4d03-8e9f-b9ee569529fc" path="/var/lib/kubelet/pods/2234c053-0318-4d03-8e9f-b9ee569529fc/volumes" Mar 21 05:23:21 crc kubenswrapper[4580]: I0321 05:23:21.035319 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2tfwn"] Mar 21 05:23:21 crc kubenswrapper[4580]: I0321 05:23:21.044440 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2tfwn"] Mar 21 05:23:21 crc kubenswrapper[4580]: I0321 05:23:21.650940 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3391bccc-2f7a-469a-8166-5ce7169e9917" path="/var/lib/kubelet/pods/3391bccc-2f7a-469a-8166-5ce7169e9917/volumes" Mar 21 05:23:25 crc kubenswrapper[4580]: I0321 05:23:25.627433 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:23:25 crc kubenswrapper[4580]: E0321 05:23:25.627989 4580 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:23:34 crc kubenswrapper[4580]: I0321 05:23:34.036354 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-f2ksl"] Mar 21 05:23:34 crc kubenswrapper[4580]: I0321 05:23:34.042957 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-m7rwg"] Mar 21 05:23:34 crc kubenswrapper[4580]: I0321 05:23:34.053979 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-tb6cr"] Mar 21 05:23:34 crc kubenswrapper[4580]: I0321 05:23:34.066293 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-f2ksl"] Mar 21 05:23:34 crc kubenswrapper[4580]: I0321 05:23:34.074178 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-tb6cr"] Mar 21 05:23:34 crc kubenswrapper[4580]: I0321 05:23:34.082015 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-m7rwg"] Mar 21 05:23:35 crc kubenswrapper[4580]: I0321 05:23:35.639889 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bfbab08-aee7-43bf-9118-252682438c95" path="/var/lib/kubelet/pods/3bfbab08-aee7-43bf-9118-252682438c95/volumes" Mar 21 05:23:35 crc kubenswrapper[4580]: I0321 05:23:35.641872 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55568564-a701-4c16-b5c4-617f88c364a5" path="/var/lib/kubelet/pods/55568564-a701-4c16-b5c4-617f88c364a5/volumes" Mar 21 05:23:35 crc kubenswrapper[4580]: I0321 05:23:35.644038 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b1916415-d4eb-4dbd-bccb-ac932a09843c" path="/var/lib/kubelet/pods/b1916415-d4eb-4dbd-bccb-ac932a09843c/volumes" Mar 21 05:23:38 crc kubenswrapper[4580]: I0321 05:23:38.618407 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:23:38 crc kubenswrapper[4580]: E0321 05:23:38.619201 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:23:49 crc kubenswrapper[4580]: I0321 05:23:49.617995 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:23:49 crc kubenswrapper[4580]: E0321 05:23:49.618625 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:24:00 crc kubenswrapper[4580]: I0321 05:24:00.141975 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567844-nn75n"] Mar 21 05:24:00 crc kubenswrapper[4580]: I0321 05:24:00.144084 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-nn75n" Mar 21 05:24:00 crc kubenswrapper[4580]: I0321 05:24:00.146339 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:24:00 crc kubenswrapper[4580]: I0321 05:24:00.146474 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:24:00 crc kubenswrapper[4580]: I0321 05:24:00.146554 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:24:00 crc kubenswrapper[4580]: I0321 05:24:00.150686 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-nn75n"] Mar 21 05:24:00 crc kubenswrapper[4580]: I0321 05:24:00.323400 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klzwl\" (UniqueName: \"kubernetes.io/projected/487961e4-9ba8-498a-8330-ef215db7eb8e-kube-api-access-klzwl\") pod \"auto-csr-approver-29567844-nn75n\" (UID: \"487961e4-9ba8-498a-8330-ef215db7eb8e\") " pod="openshift-infra/auto-csr-approver-29567844-nn75n" Mar 21 05:24:00 crc kubenswrapper[4580]: I0321 05:24:00.424757 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klzwl\" (UniqueName: \"kubernetes.io/projected/487961e4-9ba8-498a-8330-ef215db7eb8e-kube-api-access-klzwl\") pod \"auto-csr-approver-29567844-nn75n\" (UID: \"487961e4-9ba8-498a-8330-ef215db7eb8e\") " pod="openshift-infra/auto-csr-approver-29567844-nn75n" Mar 21 05:24:00 crc kubenswrapper[4580]: I0321 05:24:00.444926 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klzwl\" (UniqueName: \"kubernetes.io/projected/487961e4-9ba8-498a-8330-ef215db7eb8e-kube-api-access-klzwl\") pod \"auto-csr-approver-29567844-nn75n\" (UID: \"487961e4-9ba8-498a-8330-ef215db7eb8e\") " 
pod="openshift-infra/auto-csr-approver-29567844-nn75n" Mar 21 05:24:00 crc kubenswrapper[4580]: I0321 05:24:00.464026 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-nn75n" Mar 21 05:24:00 crc kubenswrapper[4580]: I0321 05:24:00.908891 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-nn75n"] Mar 21 05:24:00 crc kubenswrapper[4580]: I0321 05:24:00.917581 4580 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:24:01 crc kubenswrapper[4580]: I0321 05:24:01.738377 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567844-nn75n" event={"ID":"487961e4-9ba8-498a-8330-ef215db7eb8e","Type":"ContainerStarted","Data":"701461244e65998d90018322682d8c8b0f8bd17c9f86bc4cb7c9a4d43284b77d"} Mar 21 05:24:02 crc kubenswrapper[4580]: I0321 05:24:02.747270 4580 generic.go:334] "Generic (PLEG): container finished" podID="487961e4-9ba8-498a-8330-ef215db7eb8e" containerID="5f9d64afaa5aed2bc32b9e3a39c16db91918bf5aa80be5631dae3f3adc43f0f1" exitCode=0 Mar 21 05:24:02 crc kubenswrapper[4580]: I0321 05:24:02.747374 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567844-nn75n" event={"ID":"487961e4-9ba8-498a-8330-ef215db7eb8e","Type":"ContainerDied","Data":"5f9d64afaa5aed2bc32b9e3a39c16db91918bf5aa80be5631dae3f3adc43f0f1"} Mar 21 05:24:04 crc kubenswrapper[4580]: I0321 05:24:04.094704 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-nn75n" Mar 21 05:24:04 crc kubenswrapper[4580]: I0321 05:24:04.193278 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klzwl\" (UniqueName: \"kubernetes.io/projected/487961e4-9ba8-498a-8330-ef215db7eb8e-kube-api-access-klzwl\") pod \"487961e4-9ba8-498a-8330-ef215db7eb8e\" (UID: \"487961e4-9ba8-498a-8330-ef215db7eb8e\") " Mar 21 05:24:04 crc kubenswrapper[4580]: I0321 05:24:04.205998 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487961e4-9ba8-498a-8330-ef215db7eb8e-kube-api-access-klzwl" (OuterVolumeSpecName: "kube-api-access-klzwl") pod "487961e4-9ba8-498a-8330-ef215db7eb8e" (UID: "487961e4-9ba8-498a-8330-ef215db7eb8e"). InnerVolumeSpecName "kube-api-access-klzwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:24:04 crc kubenswrapper[4580]: I0321 05:24:04.296267 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klzwl\" (UniqueName: \"kubernetes.io/projected/487961e4-9ba8-498a-8330-ef215db7eb8e-kube-api-access-klzwl\") on node \"crc\" DevicePath \"\"" Mar 21 05:24:04 crc kubenswrapper[4580]: I0321 05:24:04.617859 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:24:04 crc kubenswrapper[4580]: E0321 05:24:04.618457 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:24:04 crc kubenswrapper[4580]: I0321 05:24:04.765378 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567844-nn75n" event={"ID":"487961e4-9ba8-498a-8330-ef215db7eb8e","Type":"ContainerDied","Data":"701461244e65998d90018322682d8c8b0f8bd17c9f86bc4cb7c9a4d43284b77d"} Mar 21 05:24:04 crc kubenswrapper[4580]: I0321 05:24:04.765415 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="701461244e65998d90018322682d8c8b0f8bd17c9f86bc4cb7c9a4d43284b77d" Mar 21 05:24:04 crc kubenswrapper[4580]: I0321 05:24:04.765470 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-nn75n" Mar 21 05:24:05 crc kubenswrapper[4580]: I0321 05:24:05.168928 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-4qndx"] Mar 21 05:24:05 crc kubenswrapper[4580]: I0321 05:24:05.177375 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-4qndx"] Mar 21 05:24:05 crc kubenswrapper[4580]: I0321 05:24:05.630315 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b0c3fc8-6255-45ef-9192-4160848b545a" path="/var/lib/kubelet/pods/7b0c3fc8-6255-45ef-9192-4160848b545a/volumes" Mar 21 05:24:05 crc kubenswrapper[4580]: I0321 05:24:05.775993 4580 generic.go:334] "Generic (PLEG): container finished" podID="73c35bcd-08ba-44f4-96c4-4d29bcf84b5f" containerID="438988393f712c308fff629e9676354cb6897623dcbb9a5b979cf7f33cae635c" exitCode=0 Mar 21 05:24:05 crc kubenswrapper[4580]: I0321 05:24:05.776081 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" event={"ID":"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f","Type":"ContainerDied","Data":"438988393f712c308fff629e9676354cb6897623dcbb9a5b979cf7f33cae635c"} Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.381802 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.459586 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-ssh-key-openstack-edpm-ipam\") pod \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\" (UID: \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\") " Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.459801 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-inventory\") pod \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\" (UID: \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\") " Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.459866 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzd8z\" (UniqueName: \"kubernetes.io/projected/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-kube-api-access-nzd8z\") pod \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\" (UID: \"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f\") " Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.466683 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-kube-api-access-nzd8z" (OuterVolumeSpecName: "kube-api-access-nzd8z") pod "73c35bcd-08ba-44f4-96c4-4d29bcf84b5f" (UID: "73c35bcd-08ba-44f4-96c4-4d29bcf84b5f"). InnerVolumeSpecName "kube-api-access-nzd8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.486069 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-inventory" (OuterVolumeSpecName: "inventory") pod "73c35bcd-08ba-44f4-96c4-4d29bcf84b5f" (UID: "73c35bcd-08ba-44f4-96c4-4d29bcf84b5f"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.488289 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "73c35bcd-08ba-44f4-96c4-4d29bcf84b5f" (UID: "73c35bcd-08ba-44f4-96c4-4d29bcf84b5f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.562191 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.562230 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzd8z\" (UniqueName: \"kubernetes.io/projected/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-kube-api-access-nzd8z\") on node \"crc\" DevicePath \"\"" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.562242 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73c35bcd-08ba-44f4-96c4-4d29bcf84b5f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.795511 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" event={"ID":"73c35bcd-08ba-44f4-96c4-4d29bcf84b5f","Type":"ContainerDied","Data":"245f9e6dca60717edf72904ecb94532c2d756d4ad60ace22661a9a30b26dc454"} Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.795558 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="245f9e6dca60717edf72904ecb94532c2d756d4ad60ace22661a9a30b26dc454" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 
05:24:07.795885 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fl44d" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.907236 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn"] Mar 21 05:24:07 crc kubenswrapper[4580]: E0321 05:24:07.907891 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487961e4-9ba8-498a-8330-ef215db7eb8e" containerName="oc" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.907906 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="487961e4-9ba8-498a-8330-ef215db7eb8e" containerName="oc" Mar 21 05:24:07 crc kubenswrapper[4580]: E0321 05:24:07.907922 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c35bcd-08ba-44f4-96c4-4d29bcf84b5f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.907929 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c35bcd-08ba-44f4-96c4-4d29bcf84b5f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.908107 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="487961e4-9ba8-498a-8330-ef215db7eb8e" containerName="oc" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.908118 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c35bcd-08ba-44f4-96c4-4d29bcf84b5f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.908714 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.915685 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.917481 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.918622 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.923823 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:24:07 crc kubenswrapper[4580]: I0321 05:24:07.938200 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn"] Mar 21 05:24:08 crc kubenswrapper[4580]: I0321 05:24:08.070543 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5014a479-6112-4f5c-9824-db4736d248f4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4chpn\" (UID: \"5014a479-6112-4f5c-9824-db4736d248f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" Mar 21 05:24:08 crc kubenswrapper[4580]: I0321 05:24:08.070843 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5014a479-6112-4f5c-9824-db4736d248f4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4chpn\" (UID: \"5014a479-6112-4f5c-9824-db4736d248f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" Mar 21 05:24:08 crc 
kubenswrapper[4580]: I0321 05:24:08.071021 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6svbj\" (UniqueName: \"kubernetes.io/projected/5014a479-6112-4f5c-9824-db4736d248f4-kube-api-access-6svbj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4chpn\" (UID: \"5014a479-6112-4f5c-9824-db4736d248f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" Mar 21 05:24:08 crc kubenswrapper[4580]: I0321 05:24:08.172717 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5014a479-6112-4f5c-9824-db4736d248f4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4chpn\" (UID: \"5014a479-6112-4f5c-9824-db4736d248f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" Mar 21 05:24:08 crc kubenswrapper[4580]: I0321 05:24:08.172871 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6svbj\" (UniqueName: \"kubernetes.io/projected/5014a479-6112-4f5c-9824-db4736d248f4-kube-api-access-6svbj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4chpn\" (UID: \"5014a479-6112-4f5c-9824-db4736d248f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" Mar 21 05:24:08 crc kubenswrapper[4580]: I0321 05:24:08.173052 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5014a479-6112-4f5c-9824-db4736d248f4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4chpn\" (UID: \"5014a479-6112-4f5c-9824-db4736d248f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" Mar 21 05:24:08 crc kubenswrapper[4580]: I0321 05:24:08.177318 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/5014a479-6112-4f5c-9824-db4736d248f4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4chpn\" (UID: \"5014a479-6112-4f5c-9824-db4736d248f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" Mar 21 05:24:08 crc kubenswrapper[4580]: I0321 05:24:08.177318 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5014a479-6112-4f5c-9824-db4736d248f4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4chpn\" (UID: \"5014a479-6112-4f5c-9824-db4736d248f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" Mar 21 05:24:08 crc kubenswrapper[4580]: I0321 05:24:08.189172 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6svbj\" (UniqueName: \"kubernetes.io/projected/5014a479-6112-4f5c-9824-db4736d248f4-kube-api-access-6svbj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4chpn\" (UID: \"5014a479-6112-4f5c-9824-db4736d248f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" Mar 21 05:24:08 crc kubenswrapper[4580]: I0321 05:24:08.226954 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" Mar 21 05:24:08 crc kubenswrapper[4580]: I0321 05:24:08.884345 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn"] Mar 21 05:24:09 crc kubenswrapper[4580]: I0321 05:24:09.814921 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" event={"ID":"5014a479-6112-4f5c-9824-db4736d248f4","Type":"ContainerStarted","Data":"b2a5341e13292a79d7b4fdda167ea20107f427b5453d54b9dae170bdb3b5c2d4"} Mar 21 05:24:09 crc kubenswrapper[4580]: I0321 05:24:09.815483 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" event={"ID":"5014a479-6112-4f5c-9824-db4736d248f4","Type":"ContainerStarted","Data":"b84f5b2ad7806d65e44c57390f7ad3c355a19357adea5fe121b3e7802c2ae788"} Mar 21 05:24:09 crc kubenswrapper[4580]: I0321 05:24:09.838096 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" podStartSLOduration=2.220329649 podStartE2EDuration="2.838056108s" podCreationTimestamp="2026-03-21 05:24:07 +0000 UTC" firstStartedPulling="2026-03-21 05:24:08.88444803 +0000 UTC m=+1953.967031658" lastFinishedPulling="2026-03-21 05:24:09.502174479 +0000 UTC m=+1954.584758117" observedRunningTime="2026-03-21 05:24:09.832655713 +0000 UTC m=+1954.915239361" watchObservedRunningTime="2026-03-21 05:24:09.838056108 +0000 UTC m=+1954.920639726" Mar 21 05:24:12 crc kubenswrapper[4580]: I0321 05:24:12.613298 4580 scope.go:117] "RemoveContainer" containerID="80003388c09c49a1e50de6ecfae462a1a23aa4e34d1c67cab721f187e4fb3b28" Mar 21 05:24:12 crc kubenswrapper[4580]: I0321 05:24:12.669076 4580 scope.go:117] "RemoveContainer" 
containerID="5271d2e1c3d8b200abfe020ec89602d597e379128fb07749c2f006b86f466527" Mar 21 05:24:12 crc kubenswrapper[4580]: I0321 05:24:12.697345 4580 scope.go:117] "RemoveContainer" containerID="8f80ca1e7bd3a43740a103e7a0d8280fd86914f170cee6de45d81273597cb752" Mar 21 05:24:12 crc kubenswrapper[4580]: I0321 05:24:12.748598 4580 scope.go:117] "RemoveContainer" containerID="cf1fbd9cf540e60eee221e084401254071b0f33e589ff7e0ba9e0bd010cc35ad" Mar 21 05:24:12 crc kubenswrapper[4580]: I0321 05:24:12.809189 4580 scope.go:117] "RemoveContainer" containerID="74309c1e8a8be97da322068060f5c3c7724a07d0be54c0e9b9f1e700809e5b3b" Mar 21 05:24:12 crc kubenswrapper[4580]: I0321 05:24:12.854185 4580 scope.go:117] "RemoveContainer" containerID="217ccbfd9b38375bf06fdf2171c10e2a95a0662f2f6a33b874445488a76b2f60" Mar 21 05:24:18 crc kubenswrapper[4580]: I0321 05:24:18.618409 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:24:18 crc kubenswrapper[4580]: E0321 05:24:18.619214 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:24:30 crc kubenswrapper[4580]: I0321 05:24:30.617558 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:24:30 crc kubenswrapper[4580]: E0321 05:24:30.618325 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:24:43 crc kubenswrapper[4580]: I0321 05:24:43.619383 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:24:43 crc kubenswrapper[4580]: E0321 05:24:43.620160 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:24:50 crc kubenswrapper[4580]: I0321 05:24:50.046209 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mtd7h"] Mar 21 05:24:50 crc kubenswrapper[4580]: I0321 05:24:50.058475 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mtd7h"] Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.036056 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6631-account-create-update-ndfnr"] Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.047664 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-t6f99"] Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.060150 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wtwjf"] Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.070920 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5b39-account-create-update-b7fwt"] Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.078509 4580 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-cell0-8a46-account-create-update-nh9zc"] Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.086678 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5b39-account-create-update-b7fwt"] Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.094316 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8a46-account-create-update-nh9zc"] Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.101975 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6631-account-create-update-ndfnr"] Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.110337 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wtwjf"] Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.119189 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-t6f99"] Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.630100 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bd7c6be-cb25-4e54-9d83-c40520b86d3e" path="/var/lib/kubelet/pods/0bd7c6be-cb25-4e54-9d83-c40520b86d3e/volumes" Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.631286 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="387f8115-6262-4bbf-9277-1452a5b29e47" path="/var/lib/kubelet/pods/387f8115-6262-4bbf-9277-1452a5b29e47/volumes" Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.632350 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45a8958a-c142-475e-b5c6-3d405ad76dda" path="/var/lib/kubelet/pods/45a8958a-c142-475e-b5c6-3d405ad76dda/volumes" Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.633279 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89a60a11-c3ae-4f01-8397-dd383dc3fc64" path="/var/lib/kubelet/pods/89a60a11-c3ae-4f01-8397-dd383dc3fc64/volumes" Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.634295 
4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9923e811-8726-478a-8357-55d473752028" path="/var/lib/kubelet/pods/9923e811-8726-478a-8357-55d473752028/volumes" Mar 21 05:24:51 crc kubenswrapper[4580]: I0321 05:24:51.635419 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6fc275e-ec41-4c09-b451-67136fa617fd" path="/var/lib/kubelet/pods/d6fc275e-ec41-4c09-b451-67136fa617fd/volumes" Mar 21 05:24:55 crc kubenswrapper[4580]: I0321 05:24:55.623936 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:24:55 crc kubenswrapper[4580]: E0321 05:24:55.624729 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:25:08 crc kubenswrapper[4580]: I0321 05:25:08.618865 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:25:08 crc kubenswrapper[4580]: E0321 05:25:08.621774 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:25:13 crc kubenswrapper[4580]: I0321 05:25:13.019895 4580 scope.go:117] "RemoveContainer" containerID="8b0e0e7ba8683b5fd23ac9c1cfeca30cf4da6cd0f9ef8dc566c0975e27087bb4" Mar 21 05:25:13 crc 
kubenswrapper[4580]: I0321 05:25:13.055058 4580 scope.go:117] "RemoveContainer" containerID="797d5aec4bba2ca59b1854fbcc9dcb2f7bccaa718ac9f7b5b34226adf9a55fcf" Mar 21 05:25:13 crc kubenswrapper[4580]: I0321 05:25:13.103051 4580 scope.go:117] "RemoveContainer" containerID="12fff88dca4bf339f4df2f7748bd281eb6b5ec069d6370fa0c5d87f4a41ea20b" Mar 21 05:25:13 crc kubenswrapper[4580]: I0321 05:25:13.135217 4580 scope.go:117] "RemoveContainer" containerID="bc8c6a6099ae8c2062e398024d430a92d1ff54bb196cf5d1d6ad9b967e3550a4" Mar 21 05:25:13 crc kubenswrapper[4580]: I0321 05:25:13.222329 4580 scope.go:117] "RemoveContainer" containerID="1145ff06f67b0de3fa2e9a762e4cd67564c504f94ad4bac001be275aa6d4bc17" Mar 21 05:25:13 crc kubenswrapper[4580]: I0321 05:25:13.256821 4580 scope.go:117] "RemoveContainer" containerID="efb914583c5e717216d083dc481c5adcb0509376eed0a20386e169e9acdebfc7" Mar 21 05:25:21 crc kubenswrapper[4580]: I0321 05:25:21.618075 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:25:21 crc kubenswrapper[4580]: E0321 05:25:21.618858 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:25:22 crc kubenswrapper[4580]: I0321 05:25:22.469375 4580 generic.go:334] "Generic (PLEG): container finished" podID="5014a479-6112-4f5c-9824-db4736d248f4" containerID="b2a5341e13292a79d7b4fdda167ea20107f427b5453d54b9dae170bdb3b5c2d4" exitCode=0 Mar 21 05:25:22 crc kubenswrapper[4580]: I0321 05:25:22.469447 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" 
event={"ID":"5014a479-6112-4f5c-9824-db4736d248f4","Type":"ContainerDied","Data":"b2a5341e13292a79d7b4fdda167ea20107f427b5453d54b9dae170bdb3b5c2d4"} Mar 21 05:25:23 crc kubenswrapper[4580]: I0321 05:25:23.899411 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" Mar 21 05:25:23 crc kubenswrapper[4580]: I0321 05:25:23.989989 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5014a479-6112-4f5c-9824-db4736d248f4-inventory\") pod \"5014a479-6112-4f5c-9824-db4736d248f4\" (UID: \"5014a479-6112-4f5c-9824-db4736d248f4\") " Mar 21 05:25:23 crc kubenswrapper[4580]: I0321 05:25:23.990197 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5014a479-6112-4f5c-9824-db4736d248f4-ssh-key-openstack-edpm-ipam\") pod \"5014a479-6112-4f5c-9824-db4736d248f4\" (UID: \"5014a479-6112-4f5c-9824-db4736d248f4\") " Mar 21 05:25:23 crc kubenswrapper[4580]: I0321 05:25:23.990252 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6svbj\" (UniqueName: \"kubernetes.io/projected/5014a479-6112-4f5c-9824-db4736d248f4-kube-api-access-6svbj\") pod \"5014a479-6112-4f5c-9824-db4736d248f4\" (UID: \"5014a479-6112-4f5c-9824-db4736d248f4\") " Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.002039 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5014a479-6112-4f5c-9824-db4736d248f4-kube-api-access-6svbj" (OuterVolumeSpecName: "kube-api-access-6svbj") pod "5014a479-6112-4f5c-9824-db4736d248f4" (UID: "5014a479-6112-4f5c-9824-db4736d248f4"). InnerVolumeSpecName "kube-api-access-6svbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.016696 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5014a479-6112-4f5c-9824-db4736d248f4-inventory" (OuterVolumeSpecName: "inventory") pod "5014a479-6112-4f5c-9824-db4736d248f4" (UID: "5014a479-6112-4f5c-9824-db4736d248f4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.027622 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5014a479-6112-4f5c-9824-db4736d248f4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5014a479-6112-4f5c-9824-db4736d248f4" (UID: "5014a479-6112-4f5c-9824-db4736d248f4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.092483 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5014a479-6112-4f5c-9824-db4736d248f4-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.092550 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5014a479-6112-4f5c-9824-db4736d248f4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.092564 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6svbj\" (UniqueName: \"kubernetes.io/projected/5014a479-6112-4f5c-9824-db4736d248f4-kube-api-access-6svbj\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.494160 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" 
event={"ID":"5014a479-6112-4f5c-9824-db4736d248f4","Type":"ContainerDied","Data":"b84f5b2ad7806d65e44c57390f7ad3c355a19357adea5fe121b3e7802c2ae788"} Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.494498 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b84f5b2ad7806d65e44c57390f7ad3c355a19357adea5fe121b3e7802c2ae788" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.494222 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4chpn" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.597162 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2"] Mar 21 05:25:24 crc kubenswrapper[4580]: E0321 05:25:24.598092 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5014a479-6112-4f5c-9824-db4736d248f4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.598121 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5014a479-6112-4f5c-9824-db4736d248f4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.598402 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="5014a479-6112-4f5c-9824-db4736d248f4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.599415 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.609236 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2"] Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.610449 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.610596 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.610643 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.612085 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.707070 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c58wm\" (UniqueName: \"kubernetes.io/projected/01313952-673b-45c9-b24b-0317ed817834-kube-api-access-c58wm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2\" (UID: \"01313952-673b-45c9-b24b-0317ed817834\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.707250 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01313952-673b-45c9-b24b-0317ed817834-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2\" (UID: \"01313952-673b-45c9-b24b-0317ed817834\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 
05:25:24.707358 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01313952-673b-45c9-b24b-0317ed817834-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2\" (UID: \"01313952-673b-45c9-b24b-0317ed817834\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.809399 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c58wm\" (UniqueName: \"kubernetes.io/projected/01313952-673b-45c9-b24b-0317ed817834-kube-api-access-c58wm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2\" (UID: \"01313952-673b-45c9-b24b-0317ed817834\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.809770 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01313952-673b-45c9-b24b-0317ed817834-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2\" (UID: \"01313952-673b-45c9-b24b-0317ed817834\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.809935 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01313952-673b-45c9-b24b-0317ed817834-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2\" (UID: \"01313952-673b-45c9-b24b-0317ed817834\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.816538 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/01313952-673b-45c9-b24b-0317ed817834-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2\" (UID: \"01313952-673b-45c9-b24b-0317ed817834\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.817256 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01313952-673b-45c9-b24b-0317ed817834-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2\" (UID: \"01313952-673b-45c9-b24b-0317ed817834\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.840390 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c58wm\" (UniqueName: \"kubernetes.io/projected/01313952-673b-45c9-b24b-0317ed817834-kube-api-access-c58wm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2\" (UID: \"01313952-673b-45c9-b24b-0317ed817834\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" Mar 21 05:25:24 crc kubenswrapper[4580]: I0321 05:25:24.931522 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" Mar 21 05:25:25 crc kubenswrapper[4580]: I0321 05:25:25.438026 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2"] Mar 21 05:25:25 crc kubenswrapper[4580]: I0321 05:25:25.505533 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" event={"ID":"01313952-673b-45c9-b24b-0317ed817834","Type":"ContainerStarted","Data":"388bd357b398644db0b5756b09bd0f38fdb7658934ac34de02323c063ab6de70"} Mar 21 05:25:26 crc kubenswrapper[4580]: I0321 05:25:26.515969 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" event={"ID":"01313952-673b-45c9-b24b-0317ed817834","Type":"ContainerStarted","Data":"beab0a9deb29014522512be884e35f5761c5ddca57e6cab4197483347e3cdf7b"} Mar 21 05:25:26 crc kubenswrapper[4580]: I0321 05:25:26.534613 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" podStartSLOduration=2.075361993 podStartE2EDuration="2.534591736s" podCreationTimestamp="2026-03-21 05:25:24 +0000 UTC" firstStartedPulling="2026-03-21 05:25:25.452513289 +0000 UTC m=+2030.535096917" lastFinishedPulling="2026-03-21 05:25:25.911743032 +0000 UTC m=+2030.994326660" observedRunningTime="2026-03-21 05:25:26.532229642 +0000 UTC m=+2031.614813300" watchObservedRunningTime="2026-03-21 05:25:26.534591736 +0000 UTC m=+2031.617175374" Mar 21 05:25:30 crc kubenswrapper[4580]: I0321 05:25:30.550862 4580 generic.go:334] "Generic (PLEG): container finished" podID="01313952-673b-45c9-b24b-0317ed817834" containerID="beab0a9deb29014522512be884e35f5761c5ddca57e6cab4197483347e3cdf7b" exitCode=0 Mar 21 05:25:30 crc kubenswrapper[4580]: I0321 05:25:30.551041 4580 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" event={"ID":"01313952-673b-45c9-b24b-0317ed817834","Type":"ContainerDied","Data":"beab0a9deb29014522512be884e35f5761c5ddca57e6cab4197483347e3cdf7b"} Mar 21 05:25:31 crc kubenswrapper[4580]: I0321 05:25:31.966573 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.056003 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01313952-673b-45c9-b24b-0317ed817834-inventory\") pod \"01313952-673b-45c9-b24b-0317ed817834\" (UID: \"01313952-673b-45c9-b24b-0317ed817834\") " Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.057424 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01313952-673b-45c9-b24b-0317ed817834-ssh-key-openstack-edpm-ipam\") pod \"01313952-673b-45c9-b24b-0317ed817834\" (UID: \"01313952-673b-45c9-b24b-0317ed817834\") " Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.057532 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c58wm\" (UniqueName: \"kubernetes.io/projected/01313952-673b-45c9-b24b-0317ed817834-kube-api-access-c58wm\") pod \"01313952-673b-45c9-b24b-0317ed817834\" (UID: \"01313952-673b-45c9-b24b-0317ed817834\") " Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.063297 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01313952-673b-45c9-b24b-0317ed817834-kube-api-access-c58wm" (OuterVolumeSpecName: "kube-api-access-c58wm") pod "01313952-673b-45c9-b24b-0317ed817834" (UID: "01313952-673b-45c9-b24b-0317ed817834"). InnerVolumeSpecName "kube-api-access-c58wm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.086208 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01313952-673b-45c9-b24b-0317ed817834-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "01313952-673b-45c9-b24b-0317ed817834" (UID: "01313952-673b-45c9-b24b-0317ed817834"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.090009 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01313952-673b-45c9-b24b-0317ed817834-inventory" (OuterVolumeSpecName: "inventory") pod "01313952-673b-45c9-b24b-0317ed817834" (UID: "01313952-673b-45c9-b24b-0317ed817834"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.160240 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01313952-673b-45c9-b24b-0317ed817834-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.160431 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c58wm\" (UniqueName: \"kubernetes.io/projected/01313952-673b-45c9-b24b-0317ed817834-kube-api-access-c58wm\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.160525 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01313952-673b-45c9-b24b-0317ed817834-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.568200 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" 
event={"ID":"01313952-673b-45c9-b24b-0317ed817834","Type":"ContainerDied","Data":"388bd357b398644db0b5756b09bd0f38fdb7658934ac34de02323c063ab6de70"} Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.568240 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="388bd357b398644db0b5756b09bd0f38fdb7658934ac34de02323c063ab6de70" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.568249 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.673357 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc"] Mar 21 05:25:32 crc kubenswrapper[4580]: E0321 05:25:32.673795 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01313952-673b-45c9-b24b-0317ed817834" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.673812 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="01313952-673b-45c9-b24b-0317ed817834" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.673985 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="01313952-673b-45c9-b24b-0317ed817834" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.674646 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.678201 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.678600 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.680276 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.682365 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.685052 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc"] Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.773390 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6m2n\" (UniqueName: \"kubernetes.io/projected/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-kube-api-access-c6m2n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5b8sc\" (UID: \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.773541 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5b8sc\" (UID: \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 
05:25:32.773906 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5b8sc\" (UID: \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.875684 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5b8sc\" (UID: \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.875825 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5b8sc\" (UID: \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.875931 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6m2n\" (UniqueName: \"kubernetes.io/projected/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-kube-api-access-c6m2n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5b8sc\" (UID: \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.879574 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-5b8sc\" (UID: \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.886736 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5b8sc\" (UID: \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.899457 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6m2n\" (UniqueName: \"kubernetes.io/projected/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-kube-api-access-c6m2n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5b8sc\" (UID: \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" Mar 21 05:25:32 crc kubenswrapper[4580]: I0321 05:25:32.992288 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" Mar 21 05:25:33 crc kubenswrapper[4580]: I0321 05:25:33.482029 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc"] Mar 21 05:25:33 crc kubenswrapper[4580]: I0321 05:25:33.576615 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" event={"ID":"42a67be6-2662-40e1-a94d-0b7fa55c1bc0","Type":"ContainerStarted","Data":"029aa26ec4c203feba30036895d613e6ff678f3689f9db6903b23b5d6a0b3677"} Mar 21 05:25:33 crc kubenswrapper[4580]: I0321 05:25:33.618713 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:25:33 crc kubenswrapper[4580]: E0321 05:25:33.619089 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:25:34 crc kubenswrapper[4580]: I0321 05:25:34.052148 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vsxgp"] Mar 21 05:25:34 crc kubenswrapper[4580]: I0321 05:25:34.064435 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vsxgp"] Mar 21 05:25:34 crc kubenswrapper[4580]: I0321 05:25:34.585440 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" event={"ID":"42a67be6-2662-40e1-a94d-0b7fa55c1bc0","Type":"ContainerStarted","Data":"6b58a7d9a23f2e4614099bfcea63b6808413b9b2841ab2683d2495483a0c3582"} Mar 21 05:25:34 crc 
kubenswrapper[4580]: I0321 05:25:34.610704 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" podStartSLOduration=2.15766681 podStartE2EDuration="2.610682225s" podCreationTimestamp="2026-03-21 05:25:32 +0000 UTC" firstStartedPulling="2026-03-21 05:25:33.5060664 +0000 UTC m=+2038.588650028" lastFinishedPulling="2026-03-21 05:25:33.959081805 +0000 UTC m=+2039.041665443" observedRunningTime="2026-03-21 05:25:34.59973825 +0000 UTC m=+2039.682321888" watchObservedRunningTime="2026-03-21 05:25:34.610682225 +0000 UTC m=+2039.693265853" Mar 21 05:25:35 crc kubenswrapper[4580]: I0321 05:25:35.629654 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2804338-19fa-40f9-9945-23cabe223f46" path="/var/lib/kubelet/pods/d2804338-19fa-40f9-9945-23cabe223f46/volumes" Mar 21 05:25:45 crc kubenswrapper[4580]: I0321 05:25:45.626910 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:25:45 crc kubenswrapper[4580]: E0321 05:25:45.627738 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:25:58 crc kubenswrapper[4580]: I0321 05:25:58.618633 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:25:58 crc kubenswrapper[4580]: E0321 05:25:58.619512 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:26:00 crc kubenswrapper[4580]: I0321 05:26:00.143826 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567846-rszs5"] Mar 21 05:26:00 crc kubenswrapper[4580]: I0321 05:26:00.145410 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-rszs5" Mar 21 05:26:00 crc kubenswrapper[4580]: I0321 05:26:00.150685 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:26:00 crc kubenswrapper[4580]: I0321 05:26:00.151136 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:26:00 crc kubenswrapper[4580]: I0321 05:26:00.151640 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:26:00 crc kubenswrapper[4580]: I0321 05:26:00.161924 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-rszs5"] Mar 21 05:26:00 crc kubenswrapper[4580]: I0321 05:26:00.347407 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2wrh\" (UniqueName: \"kubernetes.io/projected/297cda95-71d8-4ad1-a1b2-a83494cb6cb6-kube-api-access-b2wrh\") pod \"auto-csr-approver-29567846-rszs5\" (UID: \"297cda95-71d8-4ad1-a1b2-a83494cb6cb6\") " pod="openshift-infra/auto-csr-approver-29567846-rszs5" Mar 21 05:26:00 crc kubenswrapper[4580]: I0321 05:26:00.449535 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2wrh\" (UniqueName: \"kubernetes.io/projected/297cda95-71d8-4ad1-a1b2-a83494cb6cb6-kube-api-access-b2wrh\") 
pod \"auto-csr-approver-29567846-rszs5\" (UID: \"297cda95-71d8-4ad1-a1b2-a83494cb6cb6\") " pod="openshift-infra/auto-csr-approver-29567846-rszs5" Mar 21 05:26:00 crc kubenswrapper[4580]: I0321 05:26:00.469091 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2wrh\" (UniqueName: \"kubernetes.io/projected/297cda95-71d8-4ad1-a1b2-a83494cb6cb6-kube-api-access-b2wrh\") pod \"auto-csr-approver-29567846-rszs5\" (UID: \"297cda95-71d8-4ad1-a1b2-a83494cb6cb6\") " pod="openshift-infra/auto-csr-approver-29567846-rszs5" Mar 21 05:26:00 crc kubenswrapper[4580]: I0321 05:26:00.768276 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-rszs5" Mar 21 05:26:01 crc kubenswrapper[4580]: I0321 05:26:01.204583 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-rszs5"] Mar 21 05:26:01 crc kubenswrapper[4580]: I0321 05:26:01.794684 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567846-rszs5" event={"ID":"297cda95-71d8-4ad1-a1b2-a83494cb6cb6","Type":"ContainerStarted","Data":"e72d4925923c2d91a01cdcbeb94d838ea512d167202144e5566029a8dc8b23c0"} Mar 21 05:26:02 crc kubenswrapper[4580]: I0321 05:26:02.806715 4580 generic.go:334] "Generic (PLEG): container finished" podID="297cda95-71d8-4ad1-a1b2-a83494cb6cb6" containerID="ad8d35f03fb40259df66cd1a7110886a3335661d8a6666636390f3283059d5eb" exitCode=0 Mar 21 05:26:02 crc kubenswrapper[4580]: I0321 05:26:02.806896 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567846-rszs5" event={"ID":"297cda95-71d8-4ad1-a1b2-a83494cb6cb6","Type":"ContainerDied","Data":"ad8d35f03fb40259df66cd1a7110886a3335661d8a6666636390f3283059d5eb"} Mar 21 05:26:04 crc kubenswrapper[4580]: I0321 05:26:04.160614 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-rszs5" Mar 21 05:26:04 crc kubenswrapper[4580]: I0321 05:26:04.331341 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2wrh\" (UniqueName: \"kubernetes.io/projected/297cda95-71d8-4ad1-a1b2-a83494cb6cb6-kube-api-access-b2wrh\") pod \"297cda95-71d8-4ad1-a1b2-a83494cb6cb6\" (UID: \"297cda95-71d8-4ad1-a1b2-a83494cb6cb6\") " Mar 21 05:26:04 crc kubenswrapper[4580]: I0321 05:26:04.349186 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/297cda95-71d8-4ad1-a1b2-a83494cb6cb6-kube-api-access-b2wrh" (OuterVolumeSpecName: "kube-api-access-b2wrh") pod "297cda95-71d8-4ad1-a1b2-a83494cb6cb6" (UID: "297cda95-71d8-4ad1-a1b2-a83494cb6cb6"). InnerVolumeSpecName "kube-api-access-b2wrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:26:04 crc kubenswrapper[4580]: I0321 05:26:04.435106 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2wrh\" (UniqueName: \"kubernetes.io/projected/297cda95-71d8-4ad1-a1b2-a83494cb6cb6-kube-api-access-b2wrh\") on node \"crc\" DevicePath \"\"" Mar 21 05:26:04 crc kubenswrapper[4580]: I0321 05:26:04.824064 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567846-rszs5" event={"ID":"297cda95-71d8-4ad1-a1b2-a83494cb6cb6","Type":"ContainerDied","Data":"e72d4925923c2d91a01cdcbeb94d838ea512d167202144e5566029a8dc8b23c0"} Mar 21 05:26:04 crc kubenswrapper[4580]: I0321 05:26:04.824501 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e72d4925923c2d91a01cdcbeb94d838ea512d167202144e5566029a8dc8b23c0" Mar 21 05:26:04 crc kubenswrapper[4580]: I0321 05:26:04.824171 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-rszs5" Mar 21 05:26:05 crc kubenswrapper[4580]: I0321 05:26:05.267359 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-wstd5"] Mar 21 05:26:05 crc kubenswrapper[4580]: I0321 05:26:05.274584 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-wstd5"] Mar 21 05:26:05 crc kubenswrapper[4580]: I0321 05:26:05.628945 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb" path="/var/lib/kubelet/pods/1f69fb6f-5183-4ea6-b6b7-4ebcddb73adb/volumes" Mar 21 05:26:08 crc kubenswrapper[4580]: I0321 05:26:08.057620 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-jw68q"] Mar 21 05:26:08 crc kubenswrapper[4580]: I0321 05:26:08.070184 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-jw68q"] Mar 21 05:26:09 crc kubenswrapper[4580]: I0321 05:26:09.629722 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1da4990-e129-41f6-acca-138ab10c03cc" path="/var/lib/kubelet/pods/c1da4990-e129-41f6-acca-138ab10c03cc/volumes" Mar 21 05:26:12 crc kubenswrapper[4580]: I0321 05:26:12.192421 4580 generic.go:334] "Generic (PLEG): container finished" podID="42a67be6-2662-40e1-a94d-0b7fa55c1bc0" containerID="6b58a7d9a23f2e4614099bfcea63b6808413b9b2841ab2683d2495483a0c3582" exitCode=0 Mar 21 05:26:12 crc kubenswrapper[4580]: I0321 05:26:12.192513 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" event={"ID":"42a67be6-2662-40e1-a94d-0b7fa55c1bc0","Type":"ContainerDied","Data":"6b58a7d9a23f2e4614099bfcea63b6808413b9b2841ab2683d2495483a0c3582"} Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.396117 4580 scope.go:117] "RemoveContainer" 
containerID="ca3d9cf7e617312d01d2a1e54bb27fc9e109de41500d1f435dc31c73a03ad3fc" Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.456217 4580 scope.go:117] "RemoveContainer" containerID="fe16d7cd3730205ea2ae2ee287da9eedbeeb418df9addf927524aa933f7be59b" Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.568252 4580 scope.go:117] "RemoveContainer" containerID="e1e71939e2eba028b0abb4f53083b37edb6d7a87fb7124ab1f067b21bbb4c602" Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.617858 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:26:13 crc kubenswrapper[4580]: E0321 05:26:13.618179 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.715457 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.742510 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-inventory\") pod \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\" (UID: \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\") " Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.742570 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-ssh-key-openstack-edpm-ipam\") pod \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\" (UID: \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\") " Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.742660 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6m2n\" (UniqueName: \"kubernetes.io/projected/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-kube-api-access-c6m2n\") pod \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\" (UID: \"42a67be6-2662-40e1-a94d-0b7fa55c1bc0\") " Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.747599 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-kube-api-access-c6m2n" (OuterVolumeSpecName: "kube-api-access-c6m2n") pod "42a67be6-2662-40e1-a94d-0b7fa55c1bc0" (UID: "42a67be6-2662-40e1-a94d-0b7fa55c1bc0"). InnerVolumeSpecName "kube-api-access-c6m2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.772466 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-inventory" (OuterVolumeSpecName: "inventory") pod "42a67be6-2662-40e1-a94d-0b7fa55c1bc0" (UID: "42a67be6-2662-40e1-a94d-0b7fa55c1bc0"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.776962 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "42a67be6-2662-40e1-a94d-0b7fa55c1bc0" (UID: "42a67be6-2662-40e1-a94d-0b7fa55c1bc0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.846007 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6m2n\" (UniqueName: \"kubernetes.io/projected/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-kube-api-access-c6m2n\") on node \"crc\" DevicePath \"\"" Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.846052 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:26:13 crc kubenswrapper[4580]: I0321 05:26:13.846066 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42a67be6-2662-40e1-a94d-0b7fa55c1bc0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.211145 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.211121 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5b8sc" event={"ID":"42a67be6-2662-40e1-a94d-0b7fa55c1bc0","Type":"ContainerDied","Data":"029aa26ec4c203feba30036895d613e6ff678f3689f9db6903b23b5d6a0b3677"} Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.211307 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="029aa26ec4c203feba30036895d613e6ff678f3689f9db6903b23b5d6a0b3677" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.333541 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n"] Mar 21 05:26:14 crc kubenswrapper[4580]: E0321 05:26:14.333943 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="297cda95-71d8-4ad1-a1b2-a83494cb6cb6" containerName="oc" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.333960 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="297cda95-71d8-4ad1-a1b2-a83494cb6cb6" containerName="oc" Mar 21 05:26:14 crc kubenswrapper[4580]: E0321 05:26:14.333989 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42a67be6-2662-40e1-a94d-0b7fa55c1bc0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.333997 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="42a67be6-2662-40e1-a94d-0b7fa55c1bc0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.334172 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="42a67be6-2662-40e1-a94d-0b7fa55c1bc0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.334194 4580 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="297cda95-71d8-4ad1-a1b2-a83494cb6cb6" containerName="oc" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.334828 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.340687 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.341060 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.341218 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.341420 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.349425 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n"] Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.376859 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fn5z\" (UniqueName: \"kubernetes.io/projected/136b85ae-b1b7-46cf-a8fa-059f29999f31-kube-api-access-7fn5z\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gck9n\" (UID: \"136b85ae-b1b7-46cf-a8fa-059f29999f31\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.376962 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136b85ae-b1b7-46cf-a8fa-059f29999f31-ssh-key-openstack-edpm-ipam\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-gck9n\" (UID: \"136b85ae-b1b7-46cf-a8fa-059f29999f31\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.377258 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136b85ae-b1b7-46cf-a8fa-059f29999f31-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gck9n\" (UID: \"136b85ae-b1b7-46cf-a8fa-059f29999f31\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.479262 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fn5z\" (UniqueName: \"kubernetes.io/projected/136b85ae-b1b7-46cf-a8fa-059f29999f31-kube-api-access-7fn5z\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gck9n\" (UID: \"136b85ae-b1b7-46cf-a8fa-059f29999f31\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.480022 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136b85ae-b1b7-46cf-a8fa-059f29999f31-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gck9n\" (UID: \"136b85ae-b1b7-46cf-a8fa-059f29999f31\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.480332 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136b85ae-b1b7-46cf-a8fa-059f29999f31-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gck9n\" (UID: \"136b85ae-b1b7-46cf-a8fa-059f29999f31\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" Mar 21 05:26:14 crc 
kubenswrapper[4580]: I0321 05:26:14.484848 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136b85ae-b1b7-46cf-a8fa-059f29999f31-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gck9n\" (UID: \"136b85ae-b1b7-46cf-a8fa-059f29999f31\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.485635 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136b85ae-b1b7-46cf-a8fa-059f29999f31-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gck9n\" (UID: \"136b85ae-b1b7-46cf-a8fa-059f29999f31\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.498307 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fn5z\" (UniqueName: \"kubernetes.io/projected/136b85ae-b1b7-46cf-a8fa-059f29999f31-kube-api-access-7fn5z\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gck9n\" (UID: \"136b85ae-b1b7-46cf-a8fa-059f29999f31\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" Mar 21 05:26:14 crc kubenswrapper[4580]: I0321 05:26:14.671987 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" Mar 21 05:26:15 crc kubenswrapper[4580]: I0321 05:26:15.221970 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n"] Mar 21 05:26:15 crc kubenswrapper[4580]: W0321 05:26:15.224763 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod136b85ae_b1b7_46cf_a8fa_059f29999f31.slice/crio-4afe23daac8f99bfcc82444d00e4a961f71e00a089aeab8726e38ed8686f984c WatchSource:0}: Error finding container 4afe23daac8f99bfcc82444d00e4a961f71e00a089aeab8726e38ed8686f984c: Status 404 returned error can't find the container with id 4afe23daac8f99bfcc82444d00e4a961f71e00a089aeab8726e38ed8686f984c Mar 21 05:26:16 crc kubenswrapper[4580]: I0321 05:26:16.228857 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" event={"ID":"136b85ae-b1b7-46cf-a8fa-059f29999f31","Type":"ContainerStarted","Data":"ec98e6d2c12904f93c6dbc6f1a26cab9b918b9cd6a816fe9a74cbb2a4b1c8108"} Mar 21 05:26:16 crc kubenswrapper[4580]: I0321 05:26:16.229282 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" event={"ID":"136b85ae-b1b7-46cf-a8fa-059f29999f31","Type":"ContainerStarted","Data":"4afe23daac8f99bfcc82444d00e4a961f71e00a089aeab8726e38ed8686f984c"} Mar 21 05:26:16 crc kubenswrapper[4580]: I0321 05:26:16.250427 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" podStartSLOduration=1.7822621170000001 podStartE2EDuration="2.250407161s" podCreationTimestamp="2026-03-21 05:26:14 +0000 UTC" firstStartedPulling="2026-03-21 05:26:15.228449313 +0000 UTC m=+2080.311032931" lastFinishedPulling="2026-03-21 05:26:15.696594347 +0000 UTC 
m=+2080.779177975" observedRunningTime="2026-03-21 05:26:16.245119849 +0000 UTC m=+2081.327703497" watchObservedRunningTime="2026-03-21 05:26:16.250407161 +0000 UTC m=+2081.332990789" Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 05:26:20.132894 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-76pxg"] Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 05:26:20.135222 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 05:26:20.149730 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-76pxg"] Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 05:26:20.307084 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gbdx\" (UniqueName: \"kubernetes.io/projected/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-kube-api-access-9gbdx\") pod \"redhat-operators-76pxg\" (UID: \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\") " pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 05:26:20.307144 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-catalog-content\") pod \"redhat-operators-76pxg\" (UID: \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\") " pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 05:26:20.307250 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-utilities\") pod \"redhat-operators-76pxg\" (UID: \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\") " pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 
05:26:20.409257 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gbdx\" (UniqueName: \"kubernetes.io/projected/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-kube-api-access-9gbdx\") pod \"redhat-operators-76pxg\" (UID: \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\") " pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 05:26:20.409333 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-catalog-content\") pod \"redhat-operators-76pxg\" (UID: \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\") " pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 05:26:20.409852 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-catalog-content\") pod \"redhat-operators-76pxg\" (UID: \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\") " pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 05:26:20.410302 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-utilities\") pod \"redhat-operators-76pxg\" (UID: \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\") " pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 05:26:20.410032 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-utilities\") pod \"redhat-operators-76pxg\" (UID: \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\") " pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 05:26:20.427139 4580 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9gbdx\" (UniqueName: \"kubernetes.io/projected/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-kube-api-access-9gbdx\") pod \"redhat-operators-76pxg\" (UID: \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\") " pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 05:26:20.512731 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:26:20 crc kubenswrapper[4580]: I0321 05:26:20.996115 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-76pxg"] Mar 21 05:26:21 crc kubenswrapper[4580]: I0321 05:26:21.271037 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76pxg" event={"ID":"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a","Type":"ContainerStarted","Data":"3792a3a51ef25bac12e4db66dedc3f0de381993d0c487a2b047b2ff54670fa8c"} Mar 21 05:26:21 crc kubenswrapper[4580]: I0321 05:26:21.271073 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76pxg" event={"ID":"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a","Type":"ContainerStarted","Data":"724083d93e59580051a6c2e9275c5e78a8c3a7714700959b388fbe2657323079"} Mar 21 05:26:22 crc kubenswrapper[4580]: I0321 05:26:22.279862 4580 generic.go:334] "Generic (PLEG): container finished" podID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerID="3792a3a51ef25bac12e4db66dedc3f0de381993d0c487a2b047b2ff54670fa8c" exitCode=0 Mar 21 05:26:22 crc kubenswrapper[4580]: I0321 05:26:22.279945 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76pxg" event={"ID":"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a","Type":"ContainerDied","Data":"3792a3a51ef25bac12e4db66dedc3f0de381993d0c487a2b047b2ff54670fa8c"} Mar 21 05:26:23 crc kubenswrapper[4580]: I0321 05:26:23.292797 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-76pxg" event={"ID":"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a","Type":"ContainerStarted","Data":"b1ecfe66339b1d7accd7793e6a876c54485679ea5bf3c9219a465ed7064ce436"} Mar 21 05:26:26 crc kubenswrapper[4580]: I0321 05:26:26.619473 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:26:26 crc kubenswrapper[4580]: E0321 05:26:26.621318 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:26:32 crc kubenswrapper[4580]: I0321 05:26:32.029735 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6xjtd"] Mar 21 05:26:32 crc kubenswrapper[4580]: I0321 05:26:32.038995 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6xjtd"] Mar 21 05:26:33 crc kubenswrapper[4580]: I0321 05:26:33.628625 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="394de333-b465-45df-8251-bb4ae573b135" path="/var/lib/kubelet/pods/394de333-b465-45df-8251-bb4ae573b135/volumes" Mar 21 05:26:34 crc kubenswrapper[4580]: I0321 05:26:34.383594 4580 generic.go:334] "Generic (PLEG): container finished" podID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerID="b1ecfe66339b1d7accd7793e6a876c54485679ea5bf3c9219a465ed7064ce436" exitCode=0 Mar 21 05:26:34 crc kubenswrapper[4580]: I0321 05:26:34.383690 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76pxg" 
event={"ID":"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a","Type":"ContainerDied","Data":"b1ecfe66339b1d7accd7793e6a876c54485679ea5bf3c9219a465ed7064ce436"} Mar 21 05:26:35 crc kubenswrapper[4580]: I0321 05:26:35.396058 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76pxg" event={"ID":"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a","Type":"ContainerStarted","Data":"ac074e5cad2f1814ddfef2108ae8256cd09de6925241d058610c064e6a156bf2"} Mar 21 05:26:35 crc kubenswrapper[4580]: I0321 05:26:35.419684 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-76pxg" podStartSLOduration=2.898276201 podStartE2EDuration="15.419662686s" podCreationTimestamp="2026-03-21 05:26:20 +0000 UTC" firstStartedPulling="2026-03-21 05:26:22.283712889 +0000 UTC m=+2087.366296517" lastFinishedPulling="2026-03-21 05:26:34.805099374 +0000 UTC m=+2099.887683002" observedRunningTime="2026-03-21 05:26:35.41240754 +0000 UTC m=+2100.494991208" watchObservedRunningTime="2026-03-21 05:26:35.419662686 +0000 UTC m=+2100.502246314" Mar 21 05:26:40 crc kubenswrapper[4580]: I0321 05:26:40.513671 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:26:40 crc kubenswrapper[4580]: I0321 05:26:40.514385 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:26:40 crc kubenswrapper[4580]: I0321 05:26:40.628421 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:26:40 crc kubenswrapper[4580]: E0321 05:26:40.629340 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:26:41 crc kubenswrapper[4580]: I0321 05:26:41.560693 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-76pxg" podUID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerName="registry-server" probeResult="failure" output=< Mar 21 05:26:41 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:26:41 crc kubenswrapper[4580]: > Mar 21 05:26:51 crc kubenswrapper[4580]: I0321 05:26:51.560872 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-76pxg" podUID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerName="registry-server" probeResult="failure" output=< Mar 21 05:26:51 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:26:51 crc kubenswrapper[4580]: > Mar 21 05:26:51 crc kubenswrapper[4580]: I0321 05:26:51.622316 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:26:52 crc kubenswrapper[4580]: I0321 05:26:52.585793 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"bcd83d2a37c6af8524563e27b746db2c348fe00e999b54fcbbce11e47079c45b"} Mar 21 05:26:57 crc kubenswrapper[4580]: I0321 05:26:57.047638 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-4z846"] Mar 21 05:26:57 crc kubenswrapper[4580]: I0321 05:26:57.056836 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-4z846"] Mar 21 05:26:57 crc kubenswrapper[4580]: I0321 05:26:57.632810 4580 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd" path="/var/lib/kubelet/pods/0b4e71e7-c443-4524-b3ff-fa4a7ba9b5fd/volumes" Mar 21 05:27:01 crc kubenswrapper[4580]: I0321 05:27:01.565478 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-76pxg" podUID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerName="registry-server" probeResult="failure" output=< Mar 21 05:27:01 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:27:01 crc kubenswrapper[4580]: > Mar 21 05:27:06 crc kubenswrapper[4580]: I0321 05:27:06.702091 4580 generic.go:334] "Generic (PLEG): container finished" podID="136b85ae-b1b7-46cf-a8fa-059f29999f31" containerID="ec98e6d2c12904f93c6dbc6f1a26cab9b918b9cd6a816fe9a74cbb2a4b1c8108" exitCode=0 Mar 21 05:27:06 crc kubenswrapper[4580]: I0321 05:27:06.702433 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" event={"ID":"136b85ae-b1b7-46cf-a8fa-059f29999f31","Type":"ContainerDied","Data":"ec98e6d2c12904f93c6dbc6f1a26cab9b918b9cd6a816fe9a74cbb2a4b1c8108"} Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.113434 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.157666 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136b85ae-b1b7-46cf-a8fa-059f29999f31-inventory\") pod \"136b85ae-b1b7-46cf-a8fa-059f29999f31\" (UID: \"136b85ae-b1b7-46cf-a8fa-059f29999f31\") " Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.157759 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136b85ae-b1b7-46cf-a8fa-059f29999f31-ssh-key-openstack-edpm-ipam\") pod \"136b85ae-b1b7-46cf-a8fa-059f29999f31\" (UID: \"136b85ae-b1b7-46cf-a8fa-059f29999f31\") " Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.157801 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fn5z\" (UniqueName: \"kubernetes.io/projected/136b85ae-b1b7-46cf-a8fa-059f29999f31-kube-api-access-7fn5z\") pod \"136b85ae-b1b7-46cf-a8fa-059f29999f31\" (UID: \"136b85ae-b1b7-46cf-a8fa-059f29999f31\") " Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.179427 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/136b85ae-b1b7-46cf-a8fa-059f29999f31-kube-api-access-7fn5z" (OuterVolumeSpecName: "kube-api-access-7fn5z") pod "136b85ae-b1b7-46cf-a8fa-059f29999f31" (UID: "136b85ae-b1b7-46cf-a8fa-059f29999f31"). InnerVolumeSpecName "kube-api-access-7fn5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.206279 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136b85ae-b1b7-46cf-a8fa-059f29999f31-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "136b85ae-b1b7-46cf-a8fa-059f29999f31" (UID: "136b85ae-b1b7-46cf-a8fa-059f29999f31"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.207765 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136b85ae-b1b7-46cf-a8fa-059f29999f31-inventory" (OuterVolumeSpecName: "inventory") pod "136b85ae-b1b7-46cf-a8fa-059f29999f31" (UID: "136b85ae-b1b7-46cf-a8fa-059f29999f31"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.260562 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136b85ae-b1b7-46cf-a8fa-059f29999f31-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.260600 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136b85ae-b1b7-46cf-a8fa-059f29999f31-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.260613 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fn5z\" (UniqueName: \"kubernetes.io/projected/136b85ae-b1b7-46cf-a8fa-059f29999f31-kube-api-access-7fn5z\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.725388 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" 
event={"ID":"136b85ae-b1b7-46cf-a8fa-059f29999f31","Type":"ContainerDied","Data":"4afe23daac8f99bfcc82444d00e4a961f71e00a089aeab8726e38ed8686f984c"} Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.725425 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4afe23daac8f99bfcc82444d00e4a961f71e00a089aeab8726e38ed8686f984c" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.725494 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gck9n" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.828708 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxplm"] Mar 21 05:27:08 crc kubenswrapper[4580]: E0321 05:27:08.829261 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136b85ae-b1b7-46cf-a8fa-059f29999f31" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.829281 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="136b85ae-b1b7-46cf-a8fa-059f29999f31" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.829700 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="136b85ae-b1b7-46cf-a8fa-059f29999f31" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.830840 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.833113 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.834618 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.835024 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.835207 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.838477 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxplm"] Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.871337 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m65zh\" (UniqueName: \"kubernetes.io/projected/b448b2a2-1171-4d9a-b28f-c0d8805134df-kube-api-access-m65zh\") pod \"ssh-known-hosts-edpm-deployment-cxplm\" (UID: \"b448b2a2-1171-4d9a-b28f-c0d8805134df\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.871467 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b448b2a2-1171-4d9a-b28f-c0d8805134df-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cxplm\" (UID: \"b448b2a2-1171-4d9a-b28f-c0d8805134df\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.871506 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b448b2a2-1171-4d9a-b28f-c0d8805134df-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cxplm\" (UID: \"b448b2a2-1171-4d9a-b28f-c0d8805134df\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.972877 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m65zh\" (UniqueName: \"kubernetes.io/projected/b448b2a2-1171-4d9a-b28f-c0d8805134df-kube-api-access-m65zh\") pod \"ssh-known-hosts-edpm-deployment-cxplm\" (UID: \"b448b2a2-1171-4d9a-b28f-c0d8805134df\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.973003 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b448b2a2-1171-4d9a-b28f-c0d8805134df-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cxplm\" (UID: \"b448b2a2-1171-4d9a-b28f-c0d8805134df\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.973032 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b448b2a2-1171-4d9a-b28f-c0d8805134df-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cxplm\" (UID: \"b448b2a2-1171-4d9a-b28f-c0d8805134df\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.976728 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b448b2a2-1171-4d9a-b28f-c0d8805134df-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cxplm\" (UID: \"b448b2a2-1171-4d9a-b28f-c0d8805134df\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" Mar 
21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.979225 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b448b2a2-1171-4d9a-b28f-c0d8805134df-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cxplm\" (UID: \"b448b2a2-1171-4d9a-b28f-c0d8805134df\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" Mar 21 05:27:08 crc kubenswrapper[4580]: I0321 05:27:08.989351 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m65zh\" (UniqueName: \"kubernetes.io/projected/b448b2a2-1171-4d9a-b28f-c0d8805134df-kube-api-access-m65zh\") pod \"ssh-known-hosts-edpm-deployment-cxplm\" (UID: \"b448b2a2-1171-4d9a-b28f-c0d8805134df\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" Mar 21 05:27:09 crc kubenswrapper[4580]: I0321 05:27:09.163476 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" Mar 21 05:27:09 crc kubenswrapper[4580]: I0321 05:27:09.709601 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxplm"] Mar 21 05:27:09 crc kubenswrapper[4580]: I0321 05:27:09.738026 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" event={"ID":"b448b2a2-1171-4d9a-b28f-c0d8805134df","Type":"ContainerStarted","Data":"13d4fc6901ed42a9559dc4a0d23fd087adb716b41d9da20a7e635ea82cdaca63"} Mar 21 05:27:10 crc kubenswrapper[4580]: I0321 05:27:10.565097 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:27:10 crc kubenswrapper[4580]: I0321 05:27:10.616348 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:27:10 crc kubenswrapper[4580]: I0321 05:27:10.746940 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" event={"ID":"b448b2a2-1171-4d9a-b28f-c0d8805134df","Type":"ContainerStarted","Data":"0bce82820e5c6ff29db077090708162d5ba3162bd180ede130fe3fe170c3e7fe"} Mar 21 05:27:10 crc kubenswrapper[4580]: I0321 05:27:10.772485 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" podStartSLOduration=2.318760508 podStartE2EDuration="2.772462872s" podCreationTimestamp="2026-03-21 05:27:08 +0000 UTC" firstStartedPulling="2026-03-21 05:27:09.717711742 +0000 UTC m=+2134.800295370" lastFinishedPulling="2026-03-21 05:27:10.171414106 +0000 UTC m=+2135.253997734" observedRunningTime="2026-03-21 05:27:10.763404648 +0000 UTC m=+2135.845988276" watchObservedRunningTime="2026-03-21 05:27:10.772462872 +0000 UTC m=+2135.855046510" Mar 21 05:27:10 crc kubenswrapper[4580]: I0321 05:27:10.801104 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-76pxg"] Mar 21 05:27:11 crc kubenswrapper[4580]: I0321 05:27:11.754061 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-76pxg" podUID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerName="registry-server" containerID="cri-o://ac074e5cad2f1814ddfef2108ae8256cd09de6925241d058610c064e6a156bf2" gracePeriod=2 Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.261796 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.355309 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-catalog-content\") pod \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\" (UID: \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\") " Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.355402 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gbdx\" (UniqueName: \"kubernetes.io/projected/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-kube-api-access-9gbdx\") pod \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\" (UID: \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\") " Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.355432 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-utilities\") pod \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\" (UID: \"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a\") " Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.356891 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-utilities" (OuterVolumeSpecName: "utilities") pod "3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" (UID: "3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.361994 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-kube-api-access-9gbdx" (OuterVolumeSpecName: "kube-api-access-9gbdx") pod "3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" (UID: "3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a"). InnerVolumeSpecName "kube-api-access-9gbdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.457545 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gbdx\" (UniqueName: \"kubernetes.io/projected/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-kube-api-access-9gbdx\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.457579 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.509235 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" (UID: "3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.560017 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.767975 4580 generic.go:334] "Generic (PLEG): container finished" podID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerID="ac074e5cad2f1814ddfef2108ae8256cd09de6925241d058610c064e6a156bf2" exitCode=0 Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.768030 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76pxg" event={"ID":"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a","Type":"ContainerDied","Data":"ac074e5cad2f1814ddfef2108ae8256cd09de6925241d058610c064e6a156bf2"} Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.768054 4580 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-76pxg" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.768077 4580 scope.go:117] "RemoveContainer" containerID="ac074e5cad2f1814ddfef2108ae8256cd09de6925241d058610c064e6a156bf2" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.768064 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76pxg" event={"ID":"3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a","Type":"ContainerDied","Data":"724083d93e59580051a6c2e9275c5e78a8c3a7714700959b388fbe2657323079"} Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.796465 4580 scope.go:117] "RemoveContainer" containerID="b1ecfe66339b1d7accd7793e6a876c54485679ea5bf3c9219a465ed7064ce436" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.811840 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-76pxg"] Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.821246 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-76pxg"] Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.836016 4580 scope.go:117] "RemoveContainer" containerID="3792a3a51ef25bac12e4db66dedc3f0de381993d0c487a2b047b2ff54670fa8c" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.882305 4580 scope.go:117] "RemoveContainer" containerID="ac074e5cad2f1814ddfef2108ae8256cd09de6925241d058610c064e6a156bf2" Mar 21 05:27:12 crc kubenswrapper[4580]: E0321 05:27:12.882720 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac074e5cad2f1814ddfef2108ae8256cd09de6925241d058610c064e6a156bf2\": container with ID starting with ac074e5cad2f1814ddfef2108ae8256cd09de6925241d058610c064e6a156bf2 not found: ID does not exist" containerID="ac074e5cad2f1814ddfef2108ae8256cd09de6925241d058610c064e6a156bf2" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.882754 4580 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac074e5cad2f1814ddfef2108ae8256cd09de6925241d058610c064e6a156bf2"} err="failed to get container status \"ac074e5cad2f1814ddfef2108ae8256cd09de6925241d058610c064e6a156bf2\": rpc error: code = NotFound desc = could not find container \"ac074e5cad2f1814ddfef2108ae8256cd09de6925241d058610c064e6a156bf2\": container with ID starting with ac074e5cad2f1814ddfef2108ae8256cd09de6925241d058610c064e6a156bf2 not found: ID does not exist" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.882789 4580 scope.go:117] "RemoveContainer" containerID="b1ecfe66339b1d7accd7793e6a876c54485679ea5bf3c9219a465ed7064ce436" Mar 21 05:27:12 crc kubenswrapper[4580]: E0321 05:27:12.883015 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1ecfe66339b1d7accd7793e6a876c54485679ea5bf3c9219a465ed7064ce436\": container with ID starting with b1ecfe66339b1d7accd7793e6a876c54485679ea5bf3c9219a465ed7064ce436 not found: ID does not exist" containerID="b1ecfe66339b1d7accd7793e6a876c54485679ea5bf3c9219a465ed7064ce436" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.883094 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1ecfe66339b1d7accd7793e6a876c54485679ea5bf3c9219a465ed7064ce436"} err="failed to get container status \"b1ecfe66339b1d7accd7793e6a876c54485679ea5bf3c9219a465ed7064ce436\": rpc error: code = NotFound desc = could not find container \"b1ecfe66339b1d7accd7793e6a876c54485679ea5bf3c9219a465ed7064ce436\": container with ID starting with b1ecfe66339b1d7accd7793e6a876c54485679ea5bf3c9219a465ed7064ce436 not found: ID does not exist" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.883145 4580 scope.go:117] "RemoveContainer" containerID="3792a3a51ef25bac12e4db66dedc3f0de381993d0c487a2b047b2ff54670fa8c" Mar 21 05:27:12 crc kubenswrapper[4580]: E0321 
05:27:12.883381 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3792a3a51ef25bac12e4db66dedc3f0de381993d0c487a2b047b2ff54670fa8c\": container with ID starting with 3792a3a51ef25bac12e4db66dedc3f0de381993d0c487a2b047b2ff54670fa8c not found: ID does not exist" containerID="3792a3a51ef25bac12e4db66dedc3f0de381993d0c487a2b047b2ff54670fa8c" Mar 21 05:27:12 crc kubenswrapper[4580]: I0321 05:27:12.883434 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3792a3a51ef25bac12e4db66dedc3f0de381993d0c487a2b047b2ff54670fa8c"} err="failed to get container status \"3792a3a51ef25bac12e4db66dedc3f0de381993d0c487a2b047b2ff54670fa8c\": rpc error: code = NotFound desc = could not find container \"3792a3a51ef25bac12e4db66dedc3f0de381993d0c487a2b047b2ff54670fa8c\": container with ID starting with 3792a3a51ef25bac12e4db66dedc3f0de381993d0c487a2b047b2ff54670fa8c not found: ID does not exist" Mar 21 05:27:13 crc kubenswrapper[4580]: I0321 05:27:13.628398 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" path="/var/lib/kubelet/pods/3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a/volumes" Mar 21 05:27:13 crc kubenswrapper[4580]: I0321 05:27:13.763215 4580 scope.go:117] "RemoveContainer" containerID="64d7ac80d9e70fa5f18bc4e5a09f88bf68eb7b9a6790c1dd7f0cbc5d458b689d" Mar 21 05:27:13 crc kubenswrapper[4580]: I0321 05:27:13.813190 4580 scope.go:117] "RemoveContainer" containerID="9d272dbb0e955717fe563a21e66e5f4aa48f6d4a9f8f73530e65ba4d4bc33129" Mar 21 05:27:17 crc kubenswrapper[4580]: I0321 05:27:17.820389 4580 generic.go:334] "Generic (PLEG): container finished" podID="b448b2a2-1171-4d9a-b28f-c0d8805134df" containerID="0bce82820e5c6ff29db077090708162d5ba3162bd180ede130fe3fe170c3e7fe" exitCode=0 Mar 21 05:27:17 crc kubenswrapper[4580]: I0321 05:27:17.820486 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" event={"ID":"b448b2a2-1171-4d9a-b28f-c0d8805134df","Type":"ContainerDied","Data":"0bce82820e5c6ff29db077090708162d5ba3162bd180ede130fe3fe170c3e7fe"} Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.242527 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.386492 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b448b2a2-1171-4d9a-b28f-c0d8805134df-ssh-key-openstack-edpm-ipam\") pod \"b448b2a2-1171-4d9a-b28f-c0d8805134df\" (UID: \"b448b2a2-1171-4d9a-b28f-c0d8805134df\") " Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.386701 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m65zh\" (UniqueName: \"kubernetes.io/projected/b448b2a2-1171-4d9a-b28f-c0d8805134df-kube-api-access-m65zh\") pod \"b448b2a2-1171-4d9a-b28f-c0d8805134df\" (UID: \"b448b2a2-1171-4d9a-b28f-c0d8805134df\") " Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.386869 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b448b2a2-1171-4d9a-b28f-c0d8805134df-inventory-0\") pod \"b448b2a2-1171-4d9a-b28f-c0d8805134df\" (UID: \"b448b2a2-1171-4d9a-b28f-c0d8805134df\") " Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.396037 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b448b2a2-1171-4d9a-b28f-c0d8805134df-kube-api-access-m65zh" (OuterVolumeSpecName: "kube-api-access-m65zh") pod "b448b2a2-1171-4d9a-b28f-c0d8805134df" (UID: "b448b2a2-1171-4d9a-b28f-c0d8805134df"). InnerVolumeSpecName "kube-api-access-m65zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.417728 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b448b2a2-1171-4d9a-b28f-c0d8805134df-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b448b2a2-1171-4d9a-b28f-c0d8805134df" (UID: "b448b2a2-1171-4d9a-b28f-c0d8805134df"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.420907 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b448b2a2-1171-4d9a-b28f-c0d8805134df-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "b448b2a2-1171-4d9a-b28f-c0d8805134df" (UID: "b448b2a2-1171-4d9a-b28f-c0d8805134df"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.489425 4580 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b448b2a2-1171-4d9a-b28f-c0d8805134df-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.489464 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b448b2a2-1171-4d9a-b28f-c0d8805134df-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.489476 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m65zh\" (UniqueName: \"kubernetes.io/projected/b448b2a2-1171-4d9a-b28f-c0d8805134df-kube-api-access-m65zh\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.842264 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" 
event={"ID":"b448b2a2-1171-4d9a-b28f-c0d8805134df","Type":"ContainerDied","Data":"13d4fc6901ed42a9559dc4a0d23fd087adb716b41d9da20a7e635ea82cdaca63"} Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.842305 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13d4fc6901ed42a9559dc4a0d23fd087adb716b41d9da20a7e635ea82cdaca63" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.842322 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxplm" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.934281 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr"] Mar 21 05:27:19 crc kubenswrapper[4580]: E0321 05:27:19.934708 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b448b2a2-1171-4d9a-b28f-c0d8805134df" containerName="ssh-known-hosts-edpm-deployment" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.934734 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b448b2a2-1171-4d9a-b28f-c0d8805134df" containerName="ssh-known-hosts-edpm-deployment" Mar 21 05:27:19 crc kubenswrapper[4580]: E0321 05:27:19.934758 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerName="extract-utilities" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.934769 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerName="extract-utilities" Mar 21 05:27:19 crc kubenswrapper[4580]: E0321 05:27:19.934820 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerName="extract-content" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.934830 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerName="extract-content" Mar 21 05:27:19 crc 
kubenswrapper[4580]: E0321 05:27:19.934857 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerName="registry-server" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.934863 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerName="registry-server" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.935081 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="3682a077-0a8f-4c9b-a8d3-7dc0ff16b32a" containerName="registry-server" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.935101 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="b448b2a2-1171-4d9a-b28f-c0d8805134df" containerName="ssh-known-hosts-edpm-deployment" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.935838 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.939503 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.939929 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.940055 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.948391 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:27:19 crc kubenswrapper[4580]: I0321 05:27:19.954609 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr"] Mar 21 05:27:20 crc kubenswrapper[4580]: I0321 05:27:20.106970 4580 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30665d01-e41e-4e5e-ad25-f4430eb5866a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcqfr\" (UID: \"30665d01-e41e-4e5e-ad25-f4430eb5866a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" Mar 21 05:27:20 crc kubenswrapper[4580]: I0321 05:27:20.107219 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkc8c\" (UniqueName: \"kubernetes.io/projected/30665d01-e41e-4e5e-ad25-f4430eb5866a-kube-api-access-kkc8c\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcqfr\" (UID: \"30665d01-e41e-4e5e-ad25-f4430eb5866a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" Mar 21 05:27:20 crc kubenswrapper[4580]: I0321 05:27:20.107336 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30665d01-e41e-4e5e-ad25-f4430eb5866a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcqfr\" (UID: \"30665d01-e41e-4e5e-ad25-f4430eb5866a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" Mar 21 05:27:20 crc kubenswrapper[4580]: I0321 05:27:20.210234 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkc8c\" (UniqueName: \"kubernetes.io/projected/30665d01-e41e-4e5e-ad25-f4430eb5866a-kube-api-access-kkc8c\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcqfr\" (UID: \"30665d01-e41e-4e5e-ad25-f4430eb5866a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" Mar 21 05:27:20 crc kubenswrapper[4580]: I0321 05:27:20.210322 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30665d01-e41e-4e5e-ad25-f4430eb5866a-inventory\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-jcqfr\" (UID: \"30665d01-e41e-4e5e-ad25-f4430eb5866a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" Mar 21 05:27:20 crc kubenswrapper[4580]: I0321 05:27:20.210448 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30665d01-e41e-4e5e-ad25-f4430eb5866a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcqfr\" (UID: \"30665d01-e41e-4e5e-ad25-f4430eb5866a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" Mar 21 05:27:20 crc kubenswrapper[4580]: I0321 05:27:20.218561 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30665d01-e41e-4e5e-ad25-f4430eb5866a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcqfr\" (UID: \"30665d01-e41e-4e5e-ad25-f4430eb5866a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" Mar 21 05:27:20 crc kubenswrapper[4580]: I0321 05:27:20.218630 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30665d01-e41e-4e5e-ad25-f4430eb5866a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcqfr\" (UID: \"30665d01-e41e-4e5e-ad25-f4430eb5866a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" Mar 21 05:27:20 crc kubenswrapper[4580]: I0321 05:27:20.229864 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkc8c\" (UniqueName: \"kubernetes.io/projected/30665d01-e41e-4e5e-ad25-f4430eb5866a-kube-api-access-kkc8c\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcqfr\" (UID: \"30665d01-e41e-4e5e-ad25-f4430eb5866a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" Mar 21 05:27:20 crc kubenswrapper[4580]: I0321 05:27:20.253120 4580 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" Mar 21 05:27:20 crc kubenswrapper[4580]: I0321 05:27:20.744647 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr"] Mar 21 05:27:20 crc kubenswrapper[4580]: I0321 05:27:20.850740 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" event={"ID":"30665d01-e41e-4e5e-ad25-f4430eb5866a","Type":"ContainerStarted","Data":"6f26efe2c6f167f3c127909dd23a10be0d9099f5342c1e795e183113774924ec"} Mar 21 05:27:21 crc kubenswrapper[4580]: I0321 05:27:21.860017 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" event={"ID":"30665d01-e41e-4e5e-ad25-f4430eb5866a","Type":"ContainerStarted","Data":"98fea1c051a6a32c7ed59370a532282502150e5bfe718fa7a3569ac2d8fecb5f"} Mar 21 05:27:21 crc kubenswrapper[4580]: I0321 05:27:21.885055 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" podStartSLOduration=2.451026787 podStartE2EDuration="2.88503782s" podCreationTimestamp="2026-03-21 05:27:19 +0000 UTC" firstStartedPulling="2026-03-21 05:27:20.7509717 +0000 UTC m=+2145.833555328" lastFinishedPulling="2026-03-21 05:27:21.184982743 +0000 UTC m=+2146.267566361" observedRunningTime="2026-03-21 05:27:21.87653686 +0000 UTC m=+2146.959120498" watchObservedRunningTime="2026-03-21 05:27:21.88503782 +0000 UTC m=+2146.967621448" Mar 21 05:27:28 crc kubenswrapper[4580]: I0321 05:27:28.929178 4580 generic.go:334] "Generic (PLEG): container finished" podID="30665d01-e41e-4e5e-ad25-f4430eb5866a" containerID="98fea1c051a6a32c7ed59370a532282502150e5bfe718fa7a3569ac2d8fecb5f" exitCode=0 Mar 21 05:27:28 crc kubenswrapper[4580]: I0321 05:27:28.929255 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" event={"ID":"30665d01-e41e-4e5e-ad25-f4430eb5866a","Type":"ContainerDied","Data":"98fea1c051a6a32c7ed59370a532282502150e5bfe718fa7a3569ac2d8fecb5f"} Mar 21 05:27:30 crc kubenswrapper[4580]: I0321 05:27:30.348585 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" Mar 21 05:27:30 crc kubenswrapper[4580]: I0321 05:27:30.508979 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30665d01-e41e-4e5e-ad25-f4430eb5866a-ssh-key-openstack-edpm-ipam\") pod \"30665d01-e41e-4e5e-ad25-f4430eb5866a\" (UID: \"30665d01-e41e-4e5e-ad25-f4430eb5866a\") " Mar 21 05:27:30 crc kubenswrapper[4580]: I0321 05:27:30.509153 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30665d01-e41e-4e5e-ad25-f4430eb5866a-inventory\") pod \"30665d01-e41e-4e5e-ad25-f4430eb5866a\" (UID: \"30665d01-e41e-4e5e-ad25-f4430eb5866a\") " Mar 21 05:27:30 crc kubenswrapper[4580]: I0321 05:27:30.509225 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkc8c\" (UniqueName: \"kubernetes.io/projected/30665d01-e41e-4e5e-ad25-f4430eb5866a-kube-api-access-kkc8c\") pod \"30665d01-e41e-4e5e-ad25-f4430eb5866a\" (UID: \"30665d01-e41e-4e5e-ad25-f4430eb5866a\") " Mar 21 05:27:30 crc kubenswrapper[4580]: I0321 05:27:30.514992 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30665d01-e41e-4e5e-ad25-f4430eb5866a-kube-api-access-kkc8c" (OuterVolumeSpecName: "kube-api-access-kkc8c") pod "30665d01-e41e-4e5e-ad25-f4430eb5866a" (UID: "30665d01-e41e-4e5e-ad25-f4430eb5866a"). InnerVolumeSpecName "kube-api-access-kkc8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:27:30 crc kubenswrapper[4580]: I0321 05:27:30.541239 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30665d01-e41e-4e5e-ad25-f4430eb5866a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "30665d01-e41e-4e5e-ad25-f4430eb5866a" (UID: "30665d01-e41e-4e5e-ad25-f4430eb5866a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:27:30 crc kubenswrapper[4580]: I0321 05:27:30.543860 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30665d01-e41e-4e5e-ad25-f4430eb5866a-inventory" (OuterVolumeSpecName: "inventory") pod "30665d01-e41e-4e5e-ad25-f4430eb5866a" (UID: "30665d01-e41e-4e5e-ad25-f4430eb5866a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:27:30 crc kubenswrapper[4580]: I0321 05:27:30.611604 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30665d01-e41e-4e5e-ad25-f4430eb5866a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:30 crc kubenswrapper[4580]: I0321 05:27:30.611644 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30665d01-e41e-4e5e-ad25-f4430eb5866a-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:30 crc kubenswrapper[4580]: I0321 05:27:30.611656 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkc8c\" (UniqueName: \"kubernetes.io/projected/30665d01-e41e-4e5e-ad25-f4430eb5866a-kube-api-access-kkc8c\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:30 crc kubenswrapper[4580]: I0321 05:27:30.948930 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" 
event={"ID":"30665d01-e41e-4e5e-ad25-f4430eb5866a","Type":"ContainerDied","Data":"6f26efe2c6f167f3c127909dd23a10be0d9099f5342c1e795e183113774924ec"} Mar 21 05:27:30 crc kubenswrapper[4580]: I0321 05:27:30.948989 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f26efe2c6f167f3c127909dd23a10be0d9099f5342c1e795e183113774924ec" Mar 21 05:27:30 crc kubenswrapper[4580]: I0321 05:27:30.949028 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcqfr" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.031238 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn"] Mar 21 05:27:31 crc kubenswrapper[4580]: E0321 05:27:31.031763 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30665d01-e41e-4e5e-ad25-f4430eb5866a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.031816 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="30665d01-e41e-4e5e-ad25-f4430eb5866a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.032058 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="30665d01-e41e-4e5e-ad25-f4430eb5866a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.032864 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.042325 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn"] Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.043663 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.043964 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.048657 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.049156 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.122248 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb5e8570-68a2-47c9-bd31-4be0389bd713-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn\" (UID: \"bb5e8570-68a2-47c9-bd31-4be0389bd713\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.122289 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb5e8570-68a2-47c9-bd31-4be0389bd713-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn\" (UID: \"bb5e8570-68a2-47c9-bd31-4be0389bd713\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.122355 4580 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm5kg\" (UniqueName: \"kubernetes.io/projected/bb5e8570-68a2-47c9-bd31-4be0389bd713-kube-api-access-sm5kg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn\" (UID: \"bb5e8570-68a2-47c9-bd31-4be0389bd713\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.224401 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb5e8570-68a2-47c9-bd31-4be0389bd713-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn\" (UID: \"bb5e8570-68a2-47c9-bd31-4be0389bd713\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.224452 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb5e8570-68a2-47c9-bd31-4be0389bd713-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn\" (UID: \"bb5e8570-68a2-47c9-bd31-4be0389bd713\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.224521 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm5kg\" (UniqueName: \"kubernetes.io/projected/bb5e8570-68a2-47c9-bd31-4be0389bd713-kube-api-access-sm5kg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn\" (UID: \"bb5e8570-68a2-47c9-bd31-4be0389bd713\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.228648 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb5e8570-68a2-47c9-bd31-4be0389bd713-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn\" (UID: \"bb5e8570-68a2-47c9-bd31-4be0389bd713\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.229543 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb5e8570-68a2-47c9-bd31-4be0389bd713-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn\" (UID: \"bb5e8570-68a2-47c9-bd31-4be0389bd713\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.241132 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm5kg\" (UniqueName: \"kubernetes.io/projected/bb5e8570-68a2-47c9-bd31-4be0389bd713-kube-api-access-sm5kg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn\" (UID: \"bb5e8570-68a2-47c9-bd31-4be0389bd713\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.348645 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.856833 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn"] Mar 21 05:27:31 crc kubenswrapper[4580]: I0321 05:27:31.957853 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" event={"ID":"bb5e8570-68a2-47c9-bd31-4be0389bd713","Type":"ContainerStarted","Data":"e7822ad371575993ad96b807432203a88b590287b044790bf91e1dce1b92b815"} Mar 21 05:27:32 crc kubenswrapper[4580]: I0321 05:27:32.969540 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" event={"ID":"bb5e8570-68a2-47c9-bd31-4be0389bd713","Type":"ContainerStarted","Data":"99ffed07de607decd7df1b0f3f2794d5a8a781e25fb177c5b995d3593c166a16"} Mar 21 05:27:32 crc kubenswrapper[4580]: I0321 05:27:32.989395 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" podStartSLOduration=1.4450247489999999 podStartE2EDuration="1.989369377s" podCreationTimestamp="2026-03-21 05:27:31 +0000 UTC" firstStartedPulling="2026-03-21 05:27:31.873954779 +0000 UTC m=+2156.956538407" lastFinishedPulling="2026-03-21 05:27:32.418299407 +0000 UTC m=+2157.500883035" observedRunningTime="2026-03-21 05:27:32.984824674 +0000 UTC m=+2158.067408322" watchObservedRunningTime="2026-03-21 05:27:32.989369377 +0000 UTC m=+2158.071953005" Mar 21 05:27:42 crc kubenswrapper[4580]: I0321 05:27:42.055397 4580 generic.go:334] "Generic (PLEG): container finished" podID="bb5e8570-68a2-47c9-bd31-4be0389bd713" containerID="99ffed07de607decd7df1b0f3f2794d5a8a781e25fb177c5b995d3593c166a16" exitCode=0 Mar 21 05:27:42 crc kubenswrapper[4580]: I0321 05:27:42.055500 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" event={"ID":"bb5e8570-68a2-47c9-bd31-4be0389bd713","Type":"ContainerDied","Data":"99ffed07de607decd7df1b0f3f2794d5a8a781e25fb177c5b995d3593c166a16"} Mar 21 05:27:43 crc kubenswrapper[4580]: I0321 05:27:43.458360 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" Mar 21 05:27:43 crc kubenswrapper[4580]: I0321 05:27:43.555521 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm5kg\" (UniqueName: \"kubernetes.io/projected/bb5e8570-68a2-47c9-bd31-4be0389bd713-kube-api-access-sm5kg\") pod \"bb5e8570-68a2-47c9-bd31-4be0389bd713\" (UID: \"bb5e8570-68a2-47c9-bd31-4be0389bd713\") " Mar 21 05:27:43 crc kubenswrapper[4580]: I0321 05:27:43.555656 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb5e8570-68a2-47c9-bd31-4be0389bd713-inventory\") pod \"bb5e8570-68a2-47c9-bd31-4be0389bd713\" (UID: \"bb5e8570-68a2-47c9-bd31-4be0389bd713\") " Mar 21 05:27:43 crc kubenswrapper[4580]: I0321 05:27:43.555881 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb5e8570-68a2-47c9-bd31-4be0389bd713-ssh-key-openstack-edpm-ipam\") pod \"bb5e8570-68a2-47c9-bd31-4be0389bd713\" (UID: \"bb5e8570-68a2-47c9-bd31-4be0389bd713\") " Mar 21 05:27:43 crc kubenswrapper[4580]: I0321 05:27:43.581963 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb5e8570-68a2-47c9-bd31-4be0389bd713-kube-api-access-sm5kg" (OuterVolumeSpecName: "kube-api-access-sm5kg") pod "bb5e8570-68a2-47c9-bd31-4be0389bd713" (UID: "bb5e8570-68a2-47c9-bd31-4be0389bd713"). InnerVolumeSpecName "kube-api-access-sm5kg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:27:43 crc kubenswrapper[4580]: I0321 05:27:43.584825 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb5e8570-68a2-47c9-bd31-4be0389bd713-inventory" (OuterVolumeSpecName: "inventory") pod "bb5e8570-68a2-47c9-bd31-4be0389bd713" (UID: "bb5e8570-68a2-47c9-bd31-4be0389bd713"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:27:43 crc kubenswrapper[4580]: I0321 05:27:43.594887 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb5e8570-68a2-47c9-bd31-4be0389bd713-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bb5e8570-68a2-47c9-bd31-4be0389bd713" (UID: "bb5e8570-68a2-47c9-bd31-4be0389bd713"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:27:43 crc kubenswrapper[4580]: I0321 05:27:43.658210 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb5e8570-68a2-47c9-bd31-4be0389bd713-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:43 crc kubenswrapper[4580]: I0321 05:27:43.658383 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm5kg\" (UniqueName: \"kubernetes.io/projected/bb5e8570-68a2-47c9-bd31-4be0389bd713-kube-api-access-sm5kg\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:43 crc kubenswrapper[4580]: I0321 05:27:43.658455 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb5e8570-68a2-47c9-bd31-4be0389bd713-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.094695 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" 
event={"ID":"bb5e8570-68a2-47c9-bd31-4be0389bd713","Type":"ContainerDied","Data":"e7822ad371575993ad96b807432203a88b590287b044790bf91e1dce1b92b815"} Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.094966 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7822ad371575993ad96b807432203a88b590287b044790bf91e1dce1b92b815" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.094895 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.173871 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5"] Mar 21 05:27:44 crc kubenswrapper[4580]: E0321 05:27:44.174358 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb5e8570-68a2-47c9-bd31-4be0389bd713" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.174384 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb5e8570-68a2-47c9-bd31-4be0389bd713" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.174624 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb5e8570-68a2-47c9-bd31-4be0389bd713" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.175422 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.180490 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.180680 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.180860 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.181051 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.181187 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.181268 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.181340 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.181407 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.192899 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5"] Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.268961 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.269263 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.269286 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4775w\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-kube-api-access-4775w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.269323 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.269342 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.269376 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.269395 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.269410 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.269443 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.269482 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.269512 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.269537 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.269558 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.269581 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371385 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371441 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371464 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4775w\" (UniqueName: 
\"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-kube-api-access-4775w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371499 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371516 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371554 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371577 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371597 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371629 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371667 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371691 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371716 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371737 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.371759 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.378185 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.379328 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.380016 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.380749 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.381256 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.382243 4580 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.382288 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.383216 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.383216 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.383422 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.384700 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.388620 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.391625 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4775w\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-kube-api-access-4775w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.396426 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7skn5\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:44 crc kubenswrapper[4580]: I0321 05:27:44.503817 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:27:45 crc kubenswrapper[4580]: I0321 05:27:45.093330 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5"] Mar 21 05:27:45 crc kubenswrapper[4580]: I0321 05:27:45.109925 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" event={"ID":"72b35ded-99db-471e-b265-5e1e0467af49","Type":"ContainerStarted","Data":"c85fe1558a8cf089000c773d1434e750744c00206babed86403997e4ec816f40"} Mar 21 05:27:46 crc kubenswrapper[4580]: I0321 05:27:46.121903 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" event={"ID":"72b35ded-99db-471e-b265-5e1e0467af49","Type":"ContainerStarted","Data":"91b861acfcf72b12d97120cccbbe3706c22a263edc4fd005d1472296eae792f8"} Mar 21 05:28:00 crc kubenswrapper[4580]: I0321 05:28:00.143164 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" podStartSLOduration=15.660243077 podStartE2EDuration="16.143146609s" podCreationTimestamp="2026-03-21 05:27:44 +0000 UTC" firstStartedPulling="2026-03-21 05:27:45.102092362 +0000 UTC m=+2170.184675990" lastFinishedPulling="2026-03-21 05:27:45.584995894 +0000 UTC m=+2170.667579522" observedRunningTime="2026-03-21 05:27:46.150678316 +0000 UTC m=+2171.233261954" watchObservedRunningTime="2026-03-21 05:28:00.143146609 +0000 UTC 
m=+2185.225730237" Mar 21 05:28:00 crc kubenswrapper[4580]: I0321 05:28:00.145324 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567848-zmvbq"] Mar 21 05:28:00 crc kubenswrapper[4580]: I0321 05:28:00.146576 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-zmvbq" Mar 21 05:28:00 crc kubenswrapper[4580]: I0321 05:28:00.148654 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:28:00 crc kubenswrapper[4580]: I0321 05:28:00.148711 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:28:00 crc kubenswrapper[4580]: I0321 05:28:00.148720 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:28:00 crc kubenswrapper[4580]: I0321 05:28:00.160246 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-zmvbq"] Mar 21 05:28:00 crc kubenswrapper[4580]: I0321 05:28:00.197529 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztvj7\" (UniqueName: \"kubernetes.io/projected/65db5d38-ca46-4fd6-999e-508bbb7e49b4-kube-api-access-ztvj7\") pod \"auto-csr-approver-29567848-zmvbq\" (UID: \"65db5d38-ca46-4fd6-999e-508bbb7e49b4\") " pod="openshift-infra/auto-csr-approver-29567848-zmvbq" Mar 21 05:28:00 crc kubenswrapper[4580]: I0321 05:28:00.299175 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztvj7\" (UniqueName: \"kubernetes.io/projected/65db5d38-ca46-4fd6-999e-508bbb7e49b4-kube-api-access-ztvj7\") pod \"auto-csr-approver-29567848-zmvbq\" (UID: \"65db5d38-ca46-4fd6-999e-508bbb7e49b4\") " pod="openshift-infra/auto-csr-approver-29567848-zmvbq" Mar 21 05:28:00 crc kubenswrapper[4580]: I0321 
05:28:00.318017 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztvj7\" (UniqueName: \"kubernetes.io/projected/65db5d38-ca46-4fd6-999e-508bbb7e49b4-kube-api-access-ztvj7\") pod \"auto-csr-approver-29567848-zmvbq\" (UID: \"65db5d38-ca46-4fd6-999e-508bbb7e49b4\") " pod="openshift-infra/auto-csr-approver-29567848-zmvbq" Mar 21 05:28:00 crc kubenswrapper[4580]: I0321 05:28:00.467255 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-zmvbq" Mar 21 05:28:00 crc kubenswrapper[4580]: I0321 05:28:00.953295 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-zmvbq"] Mar 21 05:28:01 crc kubenswrapper[4580]: I0321 05:28:01.234252 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567848-zmvbq" event={"ID":"65db5d38-ca46-4fd6-999e-508bbb7e49b4","Type":"ContainerStarted","Data":"2d7666cb73d81aee3420b8652b962bcab2c4c25aef640ab6badaae0dd12eae9e"} Mar 21 05:28:02 crc kubenswrapper[4580]: I0321 05:28:02.243691 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567848-zmvbq" event={"ID":"65db5d38-ca46-4fd6-999e-508bbb7e49b4","Type":"ContainerStarted","Data":"cb7b6fdf73ee5a020c64564651deb11e43abf4e21e4513f25c30c1ef33928fc0"} Mar 21 05:28:02 crc kubenswrapper[4580]: I0321 05:28:02.264034 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567848-zmvbq" podStartSLOduration=1.437592123 podStartE2EDuration="2.264008737s" podCreationTimestamp="2026-03-21 05:28:00 +0000 UTC" firstStartedPulling="2026-03-21 05:28:00.961957257 +0000 UTC m=+2186.044540885" lastFinishedPulling="2026-03-21 05:28:01.788373831 +0000 UTC m=+2186.870957499" observedRunningTime="2026-03-21 05:28:02.257627135 +0000 UTC m=+2187.340210763" watchObservedRunningTime="2026-03-21 05:28:02.264008737 +0000 UTC 
m=+2187.346592365" Mar 21 05:28:03 crc kubenswrapper[4580]: I0321 05:28:03.255287 4580 generic.go:334] "Generic (PLEG): container finished" podID="65db5d38-ca46-4fd6-999e-508bbb7e49b4" containerID="cb7b6fdf73ee5a020c64564651deb11e43abf4e21e4513f25c30c1ef33928fc0" exitCode=0 Mar 21 05:28:03 crc kubenswrapper[4580]: I0321 05:28:03.255337 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567848-zmvbq" event={"ID":"65db5d38-ca46-4fd6-999e-508bbb7e49b4","Type":"ContainerDied","Data":"cb7b6fdf73ee5a020c64564651deb11e43abf4e21e4513f25c30c1ef33928fc0"} Mar 21 05:28:04 crc kubenswrapper[4580]: I0321 05:28:04.573131 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-zmvbq" Mar 21 05:28:04 crc kubenswrapper[4580]: I0321 05:28:04.778819 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztvj7\" (UniqueName: \"kubernetes.io/projected/65db5d38-ca46-4fd6-999e-508bbb7e49b4-kube-api-access-ztvj7\") pod \"65db5d38-ca46-4fd6-999e-508bbb7e49b4\" (UID: \"65db5d38-ca46-4fd6-999e-508bbb7e49b4\") " Mar 21 05:28:04 crc kubenswrapper[4580]: I0321 05:28:04.784045 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65db5d38-ca46-4fd6-999e-508bbb7e49b4-kube-api-access-ztvj7" (OuterVolumeSpecName: "kube-api-access-ztvj7") pod "65db5d38-ca46-4fd6-999e-508bbb7e49b4" (UID: "65db5d38-ca46-4fd6-999e-508bbb7e49b4"). InnerVolumeSpecName "kube-api-access-ztvj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:28:04 crc kubenswrapper[4580]: I0321 05:28:04.880754 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztvj7\" (UniqueName: \"kubernetes.io/projected/65db5d38-ca46-4fd6-999e-508bbb7e49b4-kube-api-access-ztvj7\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:05 crc kubenswrapper[4580]: I0321 05:28:05.280510 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567848-zmvbq" event={"ID":"65db5d38-ca46-4fd6-999e-508bbb7e49b4","Type":"ContainerDied","Data":"2d7666cb73d81aee3420b8652b962bcab2c4c25aef640ab6badaae0dd12eae9e"} Mar 21 05:28:05 crc kubenswrapper[4580]: I0321 05:28:05.280556 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d7666cb73d81aee3420b8652b962bcab2c4c25aef640ab6badaae0dd12eae9e" Mar 21 05:28:05 crc kubenswrapper[4580]: I0321 05:28:05.280623 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-zmvbq" Mar 21 05:28:05 crc kubenswrapper[4580]: I0321 05:28:05.336550 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-7t89q"] Mar 21 05:28:05 crc kubenswrapper[4580]: I0321 05:28:05.354673 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-7t89q"] Mar 21 05:28:05 crc kubenswrapper[4580]: I0321 05:28:05.632394 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b750a30f-e9aa-4e9d-950c-66ee25d90139" path="/var/lib/kubelet/pods/b750a30f-e9aa-4e9d-950c-66ee25d90139/volumes" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.228633 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gkkds"] Mar 21 05:28:11 crc kubenswrapper[4580]: E0321 05:28:11.230864 4580 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="65db5d38-ca46-4fd6-999e-508bbb7e49b4" containerName="oc" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.230894 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="65db5d38-ca46-4fd6-999e-508bbb7e49b4" containerName="oc" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.231344 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="65db5d38-ca46-4fd6-999e-508bbb7e49b4" containerName="oc" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.232995 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.251287 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gkkds"] Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.397873 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0298364a-fc4c-4e99-b7ae-a74326d167b6-catalog-content\") pod \"certified-operators-gkkds\" (UID: \"0298364a-fc4c-4e99-b7ae-a74326d167b6\") " pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.398293 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcf2h\" (UniqueName: \"kubernetes.io/projected/0298364a-fc4c-4e99-b7ae-a74326d167b6-kube-api-access-vcf2h\") pod \"certified-operators-gkkds\" (UID: \"0298364a-fc4c-4e99-b7ae-a74326d167b6\") " pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.398421 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0298364a-fc4c-4e99-b7ae-a74326d167b6-utilities\") pod \"certified-operators-gkkds\" (UID: \"0298364a-fc4c-4e99-b7ae-a74326d167b6\") " 
pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.499841 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0298364a-fc4c-4e99-b7ae-a74326d167b6-utilities\") pod \"certified-operators-gkkds\" (UID: \"0298364a-fc4c-4e99-b7ae-a74326d167b6\") " pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.500030 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0298364a-fc4c-4e99-b7ae-a74326d167b6-catalog-content\") pod \"certified-operators-gkkds\" (UID: \"0298364a-fc4c-4e99-b7ae-a74326d167b6\") " pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.500092 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcf2h\" (UniqueName: \"kubernetes.io/projected/0298364a-fc4c-4e99-b7ae-a74326d167b6-kube-api-access-vcf2h\") pod \"certified-operators-gkkds\" (UID: \"0298364a-fc4c-4e99-b7ae-a74326d167b6\") " pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.500444 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0298364a-fc4c-4e99-b7ae-a74326d167b6-utilities\") pod \"certified-operators-gkkds\" (UID: \"0298364a-fc4c-4e99-b7ae-a74326d167b6\") " pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.500457 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0298364a-fc4c-4e99-b7ae-a74326d167b6-catalog-content\") pod \"certified-operators-gkkds\" (UID: \"0298364a-fc4c-4e99-b7ae-a74326d167b6\") " 
pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.526799 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcf2h\" (UniqueName: \"kubernetes.io/projected/0298364a-fc4c-4e99-b7ae-a74326d167b6-kube-api-access-vcf2h\") pod \"certified-operators-gkkds\" (UID: \"0298364a-fc4c-4e99-b7ae-a74326d167b6\") " pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.552977 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:11 crc kubenswrapper[4580]: I0321 05:28:11.959507 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gkkds"] Mar 21 05:28:12 crc kubenswrapper[4580]: I0321 05:28:12.333949 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkkds" event={"ID":"0298364a-fc4c-4e99-b7ae-a74326d167b6","Type":"ContainerStarted","Data":"de9724f5a1eec33f3cfe0d1bc1a4602ac22d23d48477bac6b11be1fb714c494f"} Mar 21 05:28:12 crc kubenswrapper[4580]: I0321 05:28:12.334004 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkkds" event={"ID":"0298364a-fc4c-4e99-b7ae-a74326d167b6","Type":"ContainerStarted","Data":"1d7d42066c230e6496b034b438d4544dc62c044959985a8b5fff7377301c1fc8"} Mar 21 05:28:13 crc kubenswrapper[4580]: I0321 05:28:13.344114 4580 generic.go:334] "Generic (PLEG): container finished" podID="0298364a-fc4c-4e99-b7ae-a74326d167b6" containerID="de9724f5a1eec33f3cfe0d1bc1a4602ac22d23d48477bac6b11be1fb714c494f" exitCode=0 Mar 21 05:28:13 crc kubenswrapper[4580]: I0321 05:28:13.344180 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkkds" 
event={"ID":"0298364a-fc4c-4e99-b7ae-a74326d167b6","Type":"ContainerDied","Data":"de9724f5a1eec33f3cfe0d1bc1a4602ac22d23d48477bac6b11be1fb714c494f"} Mar 21 05:28:13 crc kubenswrapper[4580]: I0321 05:28:13.909112 4580 scope.go:117] "RemoveContainer" containerID="d39090e5cec9269bdce8924e7aaa9430a8ca451cd67cff60d8c78ac08d6de8eb" Mar 21 05:28:15 crc kubenswrapper[4580]: I0321 05:28:15.360794 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkkds" event={"ID":"0298364a-fc4c-4e99-b7ae-a74326d167b6","Type":"ContainerStarted","Data":"0baf00f1aa3cb6e7ea33d455588898fb2939c88244018d333d1bbcef16356091"} Mar 21 05:28:17 crc kubenswrapper[4580]: I0321 05:28:17.378305 4580 generic.go:334] "Generic (PLEG): container finished" podID="0298364a-fc4c-4e99-b7ae-a74326d167b6" containerID="0baf00f1aa3cb6e7ea33d455588898fb2939c88244018d333d1bbcef16356091" exitCode=0 Mar 21 05:28:17 crc kubenswrapper[4580]: I0321 05:28:17.378392 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkkds" event={"ID":"0298364a-fc4c-4e99-b7ae-a74326d167b6","Type":"ContainerDied","Data":"0baf00f1aa3cb6e7ea33d455588898fb2939c88244018d333d1bbcef16356091"} Mar 21 05:28:18 crc kubenswrapper[4580]: I0321 05:28:18.394068 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkkds" event={"ID":"0298364a-fc4c-4e99-b7ae-a74326d167b6","Type":"ContainerStarted","Data":"f7efba2c1c376fc28b57a87ba151246198571d3729768f898d72d1454fd024a3"} Mar 21 05:28:18 crc kubenswrapper[4580]: I0321 05:28:18.438509 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gkkds" podStartSLOduration=2.878327594 podStartE2EDuration="7.438473717s" podCreationTimestamp="2026-03-21 05:28:11 +0000 UTC" firstStartedPulling="2026-03-21 05:28:13.34706158 +0000 UTC m=+2198.429645208" lastFinishedPulling="2026-03-21 05:28:17.907207703 +0000 
UTC m=+2202.989791331" observedRunningTime="2026-03-21 05:28:18.41929772 +0000 UTC m=+2203.501881378" watchObservedRunningTime="2026-03-21 05:28:18.438473717 +0000 UTC m=+2203.521057345" Mar 21 05:28:21 crc kubenswrapper[4580]: I0321 05:28:21.553753 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:21 crc kubenswrapper[4580]: I0321 05:28:21.554165 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:21 crc kubenswrapper[4580]: I0321 05:28:21.613587 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:25 crc kubenswrapper[4580]: I0321 05:28:25.448302 4580 generic.go:334] "Generic (PLEG): container finished" podID="72b35ded-99db-471e-b265-5e1e0467af49" containerID="91b861acfcf72b12d97120cccbbe3706c22a263edc4fd005d1472296eae792f8" exitCode=0 Mar 21 05:28:25 crc kubenswrapper[4580]: I0321 05:28:25.448377 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" event={"ID":"72b35ded-99db-471e-b265-5e1e0467af49","Type":"ContainerDied","Data":"91b861acfcf72b12d97120cccbbe3706c22a263edc4fd005d1472296eae792f8"} Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.844414 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.968845 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-ovn-default-certs-0\") pod \"72b35ded-99db-471e-b265-5e1e0467af49\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.969361 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-ssh-key-openstack-edpm-ipam\") pod \"72b35ded-99db-471e-b265-5e1e0467af49\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.969404 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-nova-combined-ca-bundle\") pod \"72b35ded-99db-471e-b265-5e1e0467af49\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.969559 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"72b35ded-99db-471e-b265-5e1e0467af49\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.969626 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-neutron-metadata-combined-ca-bundle\") pod 
\"72b35ded-99db-471e-b265-5e1e0467af49\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.969667 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"72b35ded-99db-471e-b265-5e1e0467af49\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.969704 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4775w\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-kube-api-access-4775w\") pod \"72b35ded-99db-471e-b265-5e1e0467af49\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.969736 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-repo-setup-combined-ca-bundle\") pod \"72b35ded-99db-471e-b265-5e1e0467af49\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.969765 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"72b35ded-99db-471e-b265-5e1e0467af49\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.969807 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-bootstrap-combined-ca-bundle\") pod \"72b35ded-99db-471e-b265-5e1e0467af49\" 
(UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.969833 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-telemetry-combined-ca-bundle\") pod \"72b35ded-99db-471e-b265-5e1e0467af49\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.969929 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-inventory\") pod \"72b35ded-99db-471e-b265-5e1e0467af49\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.970340 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-libvirt-combined-ca-bundle\") pod \"72b35ded-99db-471e-b265-5e1e0467af49\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.970383 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-ovn-combined-ca-bundle\") pod \"72b35ded-99db-471e-b265-5e1e0467af49\" (UID: \"72b35ded-99db-471e-b265-5e1e0467af49\") " Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.977710 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.978980 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-kube-api-access-4775w" (OuterVolumeSpecName: "kube-api-access-4775w") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). InnerVolumeSpecName "kube-api-access-4775w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.979337 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.980911 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.980989 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.982638 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.983371 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.985177 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.986308 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.988642 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.990243 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:28:26 crc kubenswrapper[4580]: I0321 05:28:26.991509 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.009985 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.025153 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-inventory" (OuterVolumeSpecName: "inventory") pod "72b35ded-99db-471e-b265-5e1e0467af49" (UID: "72b35ded-99db-471e-b265-5e1e0467af49"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073037 4580 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073076 4580 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073091 4580 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073104 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4775w\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-kube-api-access-4775w\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073116 4580 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-repo-setup-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073129 4580 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073140 4580 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073151 4580 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073160 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073169 4580 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073181 4580 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073189 4580 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/72b35ded-99db-471e-b265-5e1e0467af49-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073198 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.073206 4580 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b35ded-99db-471e-b265-5e1e0467af49-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.468347 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" event={"ID":"72b35ded-99db-471e-b265-5e1e0467af49","Type":"ContainerDied","Data":"c85fe1558a8cf089000c773d1434e750744c00206babed86403997e4ec816f40"} Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.468394 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c85fe1558a8cf089000c773d1434e750744c00206babed86403997e4ec816f40" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.468483 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7skn5" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.598317 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj"] Mar 21 05:28:27 crc kubenswrapper[4580]: E0321 05:28:27.598743 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b35ded-99db-471e-b265-5e1e0467af49" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.598769 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b35ded-99db-471e-b265-5e1e0467af49" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.598999 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b35ded-99db-471e-b265-5e1e0467af49" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.599577 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.602419 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.602449 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.602436 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.603653 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.606120 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.613945 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj"] Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.684842 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v78pk\" (UniqueName: \"kubernetes.io/projected/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-kube-api-access-v78pk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.684909 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: 
\"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.684928 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.684982 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.685024 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.787153 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.787200 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.787259 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.787318 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.787416 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v78pk\" (UniqueName: \"kubernetes.io/projected/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-kube-api-access-v78pk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.788239 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.791541 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.794241 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.798069 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.805194 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v78pk\" (UniqueName: \"kubernetes.io/projected/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-kube-api-access-v78pk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-j2klj\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:27 crc kubenswrapper[4580]: I0321 05:28:27.917578 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:28:28 crc kubenswrapper[4580]: I0321 05:28:28.467877 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj"] Mar 21 05:28:29 crc kubenswrapper[4580]: I0321 05:28:29.500089 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" event={"ID":"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72","Type":"ContainerStarted","Data":"f6b66a52d85e2e967f2132e5440c3ec9765a7bad220b8029d23335cb92eca2d0"} Mar 21 05:28:29 crc kubenswrapper[4580]: I0321 05:28:29.500409 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" event={"ID":"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72","Type":"ContainerStarted","Data":"dc4c88ca94f6e27d04953844fdd5c0a807e111a31e471af309b374f7184e484c"} Mar 21 05:28:29 crc kubenswrapper[4580]: I0321 05:28:29.540151 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" podStartSLOduration=2.039125041 podStartE2EDuration="2.540120389s" podCreationTimestamp="2026-03-21 05:28:27 +0000 UTC" firstStartedPulling="2026-03-21 05:28:28.524823903 +0000 UTC m=+2213.607407541" lastFinishedPulling="2026-03-21 05:28:29.025819261 +0000 UTC m=+2214.108402889" observedRunningTime="2026-03-21 05:28:29.524695303 +0000 UTC m=+2214.607278951" watchObservedRunningTime="2026-03-21 05:28:29.540120389 +0000 UTC m=+2214.622704017" Mar 21 05:28:31 crc kubenswrapper[4580]: I0321 05:28:31.599364 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:31 crc kubenswrapper[4580]: I0321 05:28:31.654257 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gkkds"] Mar 21 05:28:32 crc kubenswrapper[4580]: I0321 
05:28:32.524702 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gkkds" podUID="0298364a-fc4c-4e99-b7ae-a74326d167b6" containerName="registry-server" containerID="cri-o://f7efba2c1c376fc28b57a87ba151246198571d3729768f898d72d1454fd024a3" gracePeriod=2 Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.004890 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.117043 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0298364a-fc4c-4e99-b7ae-a74326d167b6-utilities\") pod \"0298364a-fc4c-4e99-b7ae-a74326d167b6\" (UID: \"0298364a-fc4c-4e99-b7ae-a74326d167b6\") " Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.117279 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0298364a-fc4c-4e99-b7ae-a74326d167b6-catalog-content\") pod \"0298364a-fc4c-4e99-b7ae-a74326d167b6\" (UID: \"0298364a-fc4c-4e99-b7ae-a74326d167b6\") " Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.117310 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcf2h\" (UniqueName: \"kubernetes.io/projected/0298364a-fc4c-4e99-b7ae-a74326d167b6-kube-api-access-vcf2h\") pod \"0298364a-fc4c-4e99-b7ae-a74326d167b6\" (UID: \"0298364a-fc4c-4e99-b7ae-a74326d167b6\") " Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.118184 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0298364a-fc4c-4e99-b7ae-a74326d167b6-utilities" (OuterVolumeSpecName: "utilities") pod "0298364a-fc4c-4e99-b7ae-a74326d167b6" (UID: "0298364a-fc4c-4e99-b7ae-a74326d167b6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.130822 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0298364a-fc4c-4e99-b7ae-a74326d167b6-kube-api-access-vcf2h" (OuterVolumeSpecName: "kube-api-access-vcf2h") pod "0298364a-fc4c-4e99-b7ae-a74326d167b6" (UID: "0298364a-fc4c-4e99-b7ae-a74326d167b6"). InnerVolumeSpecName "kube-api-access-vcf2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.168212 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0298364a-fc4c-4e99-b7ae-a74326d167b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0298364a-fc4c-4e99-b7ae-a74326d167b6" (UID: "0298364a-fc4c-4e99-b7ae-a74326d167b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.219548 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0298364a-fc4c-4e99-b7ae-a74326d167b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.219580 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0298364a-fc4c-4e99-b7ae-a74326d167b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.219593 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcf2h\" (UniqueName: \"kubernetes.io/projected/0298364a-fc4c-4e99-b7ae-a74326d167b6-kube-api-access-vcf2h\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.536465 4580 generic.go:334] "Generic (PLEG): container finished" podID="0298364a-fc4c-4e99-b7ae-a74326d167b6" 
containerID="f7efba2c1c376fc28b57a87ba151246198571d3729768f898d72d1454fd024a3" exitCode=0 Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.536523 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkkds" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.536598 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkkds" event={"ID":"0298364a-fc4c-4e99-b7ae-a74326d167b6","Type":"ContainerDied","Data":"f7efba2c1c376fc28b57a87ba151246198571d3729768f898d72d1454fd024a3"} Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.536921 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkkds" event={"ID":"0298364a-fc4c-4e99-b7ae-a74326d167b6","Type":"ContainerDied","Data":"1d7d42066c230e6496b034b438d4544dc62c044959985a8b5fff7377301c1fc8"} Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.537024 4580 scope.go:117] "RemoveContainer" containerID="f7efba2c1c376fc28b57a87ba151246198571d3729768f898d72d1454fd024a3" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.558889 4580 scope.go:117] "RemoveContainer" containerID="0baf00f1aa3cb6e7ea33d455588898fb2939c88244018d333d1bbcef16356091" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.580831 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gkkds"] Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.590968 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gkkds"] Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.594279 4580 scope.go:117] "RemoveContainer" containerID="de9724f5a1eec33f3cfe0d1bc1a4602ac22d23d48477bac6b11be1fb714c494f" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.633281 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0298364a-fc4c-4e99-b7ae-a74326d167b6" 
path="/var/lib/kubelet/pods/0298364a-fc4c-4e99-b7ae-a74326d167b6/volumes" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.649349 4580 scope.go:117] "RemoveContainer" containerID="f7efba2c1c376fc28b57a87ba151246198571d3729768f898d72d1454fd024a3" Mar 21 05:28:33 crc kubenswrapper[4580]: E0321 05:28:33.649982 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7efba2c1c376fc28b57a87ba151246198571d3729768f898d72d1454fd024a3\": container with ID starting with f7efba2c1c376fc28b57a87ba151246198571d3729768f898d72d1454fd024a3 not found: ID does not exist" containerID="f7efba2c1c376fc28b57a87ba151246198571d3729768f898d72d1454fd024a3" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.650032 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7efba2c1c376fc28b57a87ba151246198571d3729768f898d72d1454fd024a3"} err="failed to get container status \"f7efba2c1c376fc28b57a87ba151246198571d3729768f898d72d1454fd024a3\": rpc error: code = NotFound desc = could not find container \"f7efba2c1c376fc28b57a87ba151246198571d3729768f898d72d1454fd024a3\": container with ID starting with f7efba2c1c376fc28b57a87ba151246198571d3729768f898d72d1454fd024a3 not found: ID does not exist" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.650068 4580 scope.go:117] "RemoveContainer" containerID="0baf00f1aa3cb6e7ea33d455588898fb2939c88244018d333d1bbcef16356091" Mar 21 05:28:33 crc kubenswrapper[4580]: E0321 05:28:33.650438 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0baf00f1aa3cb6e7ea33d455588898fb2939c88244018d333d1bbcef16356091\": container with ID starting with 0baf00f1aa3cb6e7ea33d455588898fb2939c88244018d333d1bbcef16356091 not found: ID does not exist" containerID="0baf00f1aa3cb6e7ea33d455588898fb2939c88244018d333d1bbcef16356091" Mar 21 05:28:33 crc kubenswrapper[4580]: 
I0321 05:28:33.650468 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0baf00f1aa3cb6e7ea33d455588898fb2939c88244018d333d1bbcef16356091"} err="failed to get container status \"0baf00f1aa3cb6e7ea33d455588898fb2939c88244018d333d1bbcef16356091\": rpc error: code = NotFound desc = could not find container \"0baf00f1aa3cb6e7ea33d455588898fb2939c88244018d333d1bbcef16356091\": container with ID starting with 0baf00f1aa3cb6e7ea33d455588898fb2939c88244018d333d1bbcef16356091 not found: ID does not exist" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.650489 4580 scope.go:117] "RemoveContainer" containerID="de9724f5a1eec33f3cfe0d1bc1a4602ac22d23d48477bac6b11be1fb714c494f" Mar 21 05:28:33 crc kubenswrapper[4580]: E0321 05:28:33.650853 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de9724f5a1eec33f3cfe0d1bc1a4602ac22d23d48477bac6b11be1fb714c494f\": container with ID starting with de9724f5a1eec33f3cfe0d1bc1a4602ac22d23d48477bac6b11be1fb714c494f not found: ID does not exist" containerID="de9724f5a1eec33f3cfe0d1bc1a4602ac22d23d48477bac6b11be1fb714c494f" Mar 21 05:28:33 crc kubenswrapper[4580]: I0321 05:28:33.650895 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de9724f5a1eec33f3cfe0d1bc1a4602ac22d23d48477bac6b11be1fb714c494f"} err="failed to get container status \"de9724f5a1eec33f3cfe0d1bc1a4602ac22d23d48477bac6b11be1fb714c494f\": rpc error: code = NotFound desc = could not find container \"de9724f5a1eec33f3cfe0d1bc1a4602ac22d23d48477bac6b11be1fb714c494f\": container with ID starting with de9724f5a1eec33f3cfe0d1bc1a4602ac22d23d48477bac6b11be1fb714c494f not found: ID does not exist" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.402205 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ppvkm"] Mar 21 05:28:56 crc kubenswrapper[4580]: 
E0321 05:28:56.404337 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0298364a-fc4c-4e99-b7ae-a74326d167b6" containerName="extract-content" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.404369 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0298364a-fc4c-4e99-b7ae-a74326d167b6" containerName="extract-content" Mar 21 05:28:56 crc kubenswrapper[4580]: E0321 05:28:56.404392 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0298364a-fc4c-4e99-b7ae-a74326d167b6" containerName="extract-utilities" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.404402 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0298364a-fc4c-4e99-b7ae-a74326d167b6" containerName="extract-utilities" Mar 21 05:28:56 crc kubenswrapper[4580]: E0321 05:28:56.404431 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0298364a-fc4c-4e99-b7ae-a74326d167b6" containerName="registry-server" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.404440 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0298364a-fc4c-4e99-b7ae-a74326d167b6" containerName="registry-server" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.405718 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="0298364a-fc4c-4e99-b7ae-a74326d167b6" containerName="registry-server" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.407472 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.427408 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ppvkm"] Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.535928 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1278b40e-b606-453b-b48c-31ddf97e78ed-utilities\") pod \"community-operators-ppvkm\" (UID: \"1278b40e-b606-453b-b48c-31ddf97e78ed\") " pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.536098 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1278b40e-b606-453b-b48c-31ddf97e78ed-catalog-content\") pod \"community-operators-ppvkm\" (UID: \"1278b40e-b606-453b-b48c-31ddf97e78ed\") " pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.536168 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8blkd\" (UniqueName: \"kubernetes.io/projected/1278b40e-b606-453b-b48c-31ddf97e78ed-kube-api-access-8blkd\") pod \"community-operators-ppvkm\" (UID: \"1278b40e-b606-453b-b48c-31ddf97e78ed\") " pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.638121 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1278b40e-b606-453b-b48c-31ddf97e78ed-utilities\") pod \"community-operators-ppvkm\" (UID: \"1278b40e-b606-453b-b48c-31ddf97e78ed\") " pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.638884 4580 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1278b40e-b606-453b-b48c-31ddf97e78ed-utilities\") pod \"community-operators-ppvkm\" (UID: \"1278b40e-b606-453b-b48c-31ddf97e78ed\") " pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.639117 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1278b40e-b606-453b-b48c-31ddf97e78ed-catalog-content\") pod \"community-operators-ppvkm\" (UID: \"1278b40e-b606-453b-b48c-31ddf97e78ed\") " pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.639387 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1278b40e-b606-453b-b48c-31ddf97e78ed-catalog-content\") pod \"community-operators-ppvkm\" (UID: \"1278b40e-b606-453b-b48c-31ddf97e78ed\") " pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.639446 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8blkd\" (UniqueName: \"kubernetes.io/projected/1278b40e-b606-453b-b48c-31ddf97e78ed-kube-api-access-8blkd\") pod \"community-operators-ppvkm\" (UID: \"1278b40e-b606-453b-b48c-31ddf97e78ed\") " pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.661865 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8blkd\" (UniqueName: \"kubernetes.io/projected/1278b40e-b606-453b-b48c-31ddf97e78ed-kube-api-access-8blkd\") pod \"community-operators-ppvkm\" (UID: \"1278b40e-b606-453b-b48c-31ddf97e78ed\") " pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:28:56 crc kubenswrapper[4580]: I0321 05:28:56.777012 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:28:57 crc kubenswrapper[4580]: I0321 05:28:57.322670 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ppvkm"] Mar 21 05:28:57 crc kubenswrapper[4580]: I0321 05:28:57.774194 4580 generic.go:334] "Generic (PLEG): container finished" podID="1278b40e-b606-453b-b48c-31ddf97e78ed" containerID="8c07761282df3f22b19adbd554bc8185630d3b8a5e64267dd35a802fc2808ebb" exitCode=0 Mar 21 05:28:57 crc kubenswrapper[4580]: I0321 05:28:57.774389 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppvkm" event={"ID":"1278b40e-b606-453b-b48c-31ddf97e78ed","Type":"ContainerDied","Data":"8c07761282df3f22b19adbd554bc8185630d3b8a5e64267dd35a802fc2808ebb"} Mar 21 05:28:57 crc kubenswrapper[4580]: I0321 05:28:57.774495 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppvkm" event={"ID":"1278b40e-b606-453b-b48c-31ddf97e78ed","Type":"ContainerStarted","Data":"8733e132acfdaeeb2604f55b748a27ebc29ee51aee324e23fdb88a586c2dfb89"} Mar 21 05:28:58 crc kubenswrapper[4580]: I0321 05:28:58.784530 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppvkm" event={"ID":"1278b40e-b606-453b-b48c-31ddf97e78ed","Type":"ContainerStarted","Data":"905fc79cb22f27b790985aa932a46ce407b3eafc69d88d0e952652b8c1ac78ba"} Mar 21 05:29:00 crc kubenswrapper[4580]: I0321 05:29:00.800238 4580 generic.go:334] "Generic (PLEG): container finished" podID="1278b40e-b606-453b-b48c-31ddf97e78ed" containerID="905fc79cb22f27b790985aa932a46ce407b3eafc69d88d0e952652b8c1ac78ba" exitCode=0 Mar 21 05:29:00 crc kubenswrapper[4580]: I0321 05:29:00.800319 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppvkm" 
event={"ID":"1278b40e-b606-453b-b48c-31ddf97e78ed","Type":"ContainerDied","Data":"905fc79cb22f27b790985aa932a46ce407b3eafc69d88d0e952652b8c1ac78ba"} Mar 21 05:29:01 crc kubenswrapper[4580]: I0321 05:29:01.813111 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppvkm" event={"ID":"1278b40e-b606-453b-b48c-31ddf97e78ed","Type":"ContainerStarted","Data":"1b3ee78fd1bbdfb233237d79c317b5515fb76659bacbd4d10b0ec6a5959b74d9"} Mar 21 05:29:01 crc kubenswrapper[4580]: I0321 05:29:01.839391 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ppvkm" podStartSLOduration=2.411503047 podStartE2EDuration="5.839367909s" podCreationTimestamp="2026-03-21 05:28:56 +0000 UTC" firstStartedPulling="2026-03-21 05:28:57.776055953 +0000 UTC m=+2242.858639591" lastFinishedPulling="2026-03-21 05:29:01.203920825 +0000 UTC m=+2246.286504453" observedRunningTime="2026-03-21 05:29:01.832647748 +0000 UTC m=+2246.915231386" watchObservedRunningTime="2026-03-21 05:29:01.839367909 +0000 UTC m=+2246.921951547" Mar 21 05:29:06 crc kubenswrapper[4580]: I0321 05:29:06.778135 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:29:06 crc kubenswrapper[4580]: I0321 05:29:06.778705 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:29:06 crc kubenswrapper[4580]: I0321 05:29:06.822284 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:29:06 crc kubenswrapper[4580]: I0321 05:29:06.895344 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:29:07 crc kubenswrapper[4580]: I0321 05:29:07.063139 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-ppvkm"] Mar 21 05:29:08 crc kubenswrapper[4580]: I0321 05:29:08.864980 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ppvkm" podUID="1278b40e-b606-453b-b48c-31ddf97e78ed" containerName="registry-server" containerID="cri-o://1b3ee78fd1bbdfb233237d79c317b5515fb76659bacbd4d10b0ec6a5959b74d9" gracePeriod=2 Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.355765 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.392131 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1278b40e-b606-453b-b48c-31ddf97e78ed-utilities\") pod \"1278b40e-b606-453b-b48c-31ddf97e78ed\" (UID: \"1278b40e-b606-453b-b48c-31ddf97e78ed\") " Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.392408 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1278b40e-b606-453b-b48c-31ddf97e78ed-catalog-content\") pod \"1278b40e-b606-453b-b48c-31ddf97e78ed\" (UID: \"1278b40e-b606-453b-b48c-31ddf97e78ed\") " Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.392476 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8blkd\" (UniqueName: \"kubernetes.io/projected/1278b40e-b606-453b-b48c-31ddf97e78ed-kube-api-access-8blkd\") pod \"1278b40e-b606-453b-b48c-31ddf97e78ed\" (UID: \"1278b40e-b606-453b-b48c-31ddf97e78ed\") " Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.399638 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1278b40e-b606-453b-b48c-31ddf97e78ed-kube-api-access-8blkd" (OuterVolumeSpecName: "kube-api-access-8blkd") pod 
"1278b40e-b606-453b-b48c-31ddf97e78ed" (UID: "1278b40e-b606-453b-b48c-31ddf97e78ed"). InnerVolumeSpecName "kube-api-access-8blkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.399696 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1278b40e-b606-453b-b48c-31ddf97e78ed-utilities" (OuterVolumeSpecName: "utilities") pod "1278b40e-b606-453b-b48c-31ddf97e78ed" (UID: "1278b40e-b606-453b-b48c-31ddf97e78ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.450588 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1278b40e-b606-453b-b48c-31ddf97e78ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1278b40e-b606-453b-b48c-31ddf97e78ed" (UID: "1278b40e-b606-453b-b48c-31ddf97e78ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.494879 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1278b40e-b606-453b-b48c-31ddf97e78ed-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.494919 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8blkd\" (UniqueName: \"kubernetes.io/projected/1278b40e-b606-453b-b48c-31ddf97e78ed-kube-api-access-8blkd\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.494929 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1278b40e-b606-453b-b48c-31ddf97e78ed-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.874251 4580 generic.go:334] "Generic (PLEG): container finished" podID="1278b40e-b606-453b-b48c-31ddf97e78ed" containerID="1b3ee78fd1bbdfb233237d79c317b5515fb76659bacbd4d10b0ec6a5959b74d9" exitCode=0 Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.874322 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ppvkm" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.874333 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppvkm" event={"ID":"1278b40e-b606-453b-b48c-31ddf97e78ed","Type":"ContainerDied","Data":"1b3ee78fd1bbdfb233237d79c317b5515fb76659bacbd4d10b0ec6a5959b74d9"} Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.874630 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppvkm" event={"ID":"1278b40e-b606-453b-b48c-31ddf97e78ed","Type":"ContainerDied","Data":"8733e132acfdaeeb2604f55b748a27ebc29ee51aee324e23fdb88a586c2dfb89"} Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.874651 4580 scope.go:117] "RemoveContainer" containerID="1b3ee78fd1bbdfb233237d79c317b5515fb76659bacbd4d10b0ec6a5959b74d9" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.902494 4580 scope.go:117] "RemoveContainer" containerID="905fc79cb22f27b790985aa932a46ce407b3eafc69d88d0e952652b8c1ac78ba" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.916127 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ppvkm"] Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.940595 4580 scope.go:117] "RemoveContainer" containerID="8c07761282df3f22b19adbd554bc8185630d3b8a5e64267dd35a802fc2808ebb" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.943415 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ppvkm"] Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.990174 4580 scope.go:117] "RemoveContainer" containerID="1b3ee78fd1bbdfb233237d79c317b5515fb76659bacbd4d10b0ec6a5959b74d9" Mar 21 05:29:09 crc kubenswrapper[4580]: E0321 05:29:09.990676 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1b3ee78fd1bbdfb233237d79c317b5515fb76659bacbd4d10b0ec6a5959b74d9\": container with ID starting with 1b3ee78fd1bbdfb233237d79c317b5515fb76659bacbd4d10b0ec6a5959b74d9 not found: ID does not exist" containerID="1b3ee78fd1bbdfb233237d79c317b5515fb76659bacbd4d10b0ec6a5959b74d9" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.990846 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b3ee78fd1bbdfb233237d79c317b5515fb76659bacbd4d10b0ec6a5959b74d9"} err="failed to get container status \"1b3ee78fd1bbdfb233237d79c317b5515fb76659bacbd4d10b0ec6a5959b74d9\": rpc error: code = NotFound desc = could not find container \"1b3ee78fd1bbdfb233237d79c317b5515fb76659bacbd4d10b0ec6a5959b74d9\": container with ID starting with 1b3ee78fd1bbdfb233237d79c317b5515fb76659bacbd4d10b0ec6a5959b74d9 not found: ID does not exist" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.990980 4580 scope.go:117] "RemoveContainer" containerID="905fc79cb22f27b790985aa932a46ce407b3eafc69d88d0e952652b8c1ac78ba" Mar 21 05:29:09 crc kubenswrapper[4580]: E0321 05:29:09.991431 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905fc79cb22f27b790985aa932a46ce407b3eafc69d88d0e952652b8c1ac78ba\": container with ID starting with 905fc79cb22f27b790985aa932a46ce407b3eafc69d88d0e952652b8c1ac78ba not found: ID does not exist" containerID="905fc79cb22f27b790985aa932a46ce407b3eafc69d88d0e952652b8c1ac78ba" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.991475 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905fc79cb22f27b790985aa932a46ce407b3eafc69d88d0e952652b8c1ac78ba"} err="failed to get container status \"905fc79cb22f27b790985aa932a46ce407b3eafc69d88d0e952652b8c1ac78ba\": rpc error: code = NotFound desc = could not find container \"905fc79cb22f27b790985aa932a46ce407b3eafc69d88d0e952652b8c1ac78ba\": container with ID 
starting with 905fc79cb22f27b790985aa932a46ce407b3eafc69d88d0e952652b8c1ac78ba not found: ID does not exist" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.991507 4580 scope.go:117] "RemoveContainer" containerID="8c07761282df3f22b19adbd554bc8185630d3b8a5e64267dd35a802fc2808ebb" Mar 21 05:29:09 crc kubenswrapper[4580]: E0321 05:29:09.991907 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c07761282df3f22b19adbd554bc8185630d3b8a5e64267dd35a802fc2808ebb\": container with ID starting with 8c07761282df3f22b19adbd554bc8185630d3b8a5e64267dd35a802fc2808ebb not found: ID does not exist" containerID="8c07761282df3f22b19adbd554bc8185630d3b8a5e64267dd35a802fc2808ebb" Mar 21 05:29:09 crc kubenswrapper[4580]: I0321 05:29:09.992020 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c07761282df3f22b19adbd554bc8185630d3b8a5e64267dd35a802fc2808ebb"} err="failed to get container status \"8c07761282df3f22b19adbd554bc8185630d3b8a5e64267dd35a802fc2808ebb\": rpc error: code = NotFound desc = could not find container \"8c07761282df3f22b19adbd554bc8185630d3b8a5e64267dd35a802fc2808ebb\": container with ID starting with 8c07761282df3f22b19adbd554bc8185630d3b8a5e64267dd35a802fc2808ebb not found: ID does not exist" Mar 21 05:29:11 crc kubenswrapper[4580]: I0321 05:29:11.628205 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1278b40e-b606-453b-b48c-31ddf97e78ed" path="/var/lib/kubelet/pods/1278b40e-b606-453b-b48c-31ddf97e78ed/volumes" Mar 21 05:29:15 crc kubenswrapper[4580]: I0321 05:29:15.947630 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:29:15 crc kubenswrapper[4580]: I0321 
05:29:15.948225 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.013155 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6q428"] Mar 21 05:29:23 crc kubenswrapper[4580]: E0321 05:29:23.014149 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1278b40e-b606-453b-b48c-31ddf97e78ed" containerName="registry-server" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.014167 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1278b40e-b606-453b-b48c-31ddf97e78ed" containerName="registry-server" Mar 21 05:29:23 crc kubenswrapper[4580]: E0321 05:29:23.014221 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1278b40e-b606-453b-b48c-31ddf97e78ed" containerName="extract-content" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.014230 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1278b40e-b606-453b-b48c-31ddf97e78ed" containerName="extract-content" Mar 21 05:29:23 crc kubenswrapper[4580]: E0321 05:29:23.014242 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1278b40e-b606-453b-b48c-31ddf97e78ed" containerName="extract-utilities" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.014250 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1278b40e-b606-453b-b48c-31ddf97e78ed" containerName="extract-utilities" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.019182 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="1278b40e-b606-453b-b48c-31ddf97e78ed" containerName="registry-server" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.020733 4580 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.023554 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6q428"] Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.081100 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0467c5a-5605-4a99-85ec-19b82cb85532-catalog-content\") pod \"redhat-marketplace-6q428\" (UID: \"b0467c5a-5605-4a99-85ec-19b82cb85532\") " pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.081252 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0467c5a-5605-4a99-85ec-19b82cb85532-utilities\") pod \"redhat-marketplace-6q428\" (UID: \"b0467c5a-5605-4a99-85ec-19b82cb85532\") " pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.081545 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42tqn\" (UniqueName: \"kubernetes.io/projected/b0467c5a-5605-4a99-85ec-19b82cb85532-kube-api-access-42tqn\") pod \"redhat-marketplace-6q428\" (UID: \"b0467c5a-5605-4a99-85ec-19b82cb85532\") " pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.183214 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0467c5a-5605-4a99-85ec-19b82cb85532-catalog-content\") pod \"redhat-marketplace-6q428\" (UID: \"b0467c5a-5605-4a99-85ec-19b82cb85532\") " pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.183326 4580 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0467c5a-5605-4a99-85ec-19b82cb85532-utilities\") pod \"redhat-marketplace-6q428\" (UID: \"b0467c5a-5605-4a99-85ec-19b82cb85532\") " pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.183410 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42tqn\" (UniqueName: \"kubernetes.io/projected/b0467c5a-5605-4a99-85ec-19b82cb85532-kube-api-access-42tqn\") pod \"redhat-marketplace-6q428\" (UID: \"b0467c5a-5605-4a99-85ec-19b82cb85532\") " pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.184044 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0467c5a-5605-4a99-85ec-19b82cb85532-utilities\") pod \"redhat-marketplace-6q428\" (UID: \"b0467c5a-5605-4a99-85ec-19b82cb85532\") " pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.184091 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0467c5a-5605-4a99-85ec-19b82cb85532-catalog-content\") pod \"redhat-marketplace-6q428\" (UID: \"b0467c5a-5605-4a99-85ec-19b82cb85532\") " pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.209167 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42tqn\" (UniqueName: \"kubernetes.io/projected/b0467c5a-5605-4a99-85ec-19b82cb85532-kube-api-access-42tqn\") pod \"redhat-marketplace-6q428\" (UID: \"b0467c5a-5605-4a99-85ec-19b82cb85532\") " pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.353432 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:23 crc kubenswrapper[4580]: I0321 05:29:23.734662 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6q428"] Mar 21 05:29:24 crc kubenswrapper[4580]: I0321 05:29:24.001344 4580 generic.go:334] "Generic (PLEG): container finished" podID="b0467c5a-5605-4a99-85ec-19b82cb85532" containerID="f5b6b27a5397f04a3e0f03cbe80d97e46f8484197dd4ac7810f5bc7b4a0083eb" exitCode=0 Mar 21 05:29:24 crc kubenswrapper[4580]: I0321 05:29:24.001402 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6q428" event={"ID":"b0467c5a-5605-4a99-85ec-19b82cb85532","Type":"ContainerDied","Data":"f5b6b27a5397f04a3e0f03cbe80d97e46f8484197dd4ac7810f5bc7b4a0083eb"} Mar 21 05:29:24 crc kubenswrapper[4580]: I0321 05:29:24.001697 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6q428" event={"ID":"b0467c5a-5605-4a99-85ec-19b82cb85532","Type":"ContainerStarted","Data":"36da780c7df7f1e5cb29f46b98b0138c1f8dee00a2e8a0de123aa8a87453efbd"} Mar 21 05:29:24 crc kubenswrapper[4580]: I0321 05:29:24.004472 4580 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:29:25 crc kubenswrapper[4580]: I0321 05:29:25.026683 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6q428" event={"ID":"b0467c5a-5605-4a99-85ec-19b82cb85532","Type":"ContainerStarted","Data":"8d84da3e9035cf308aae07521766f199a50acd49ca2d778dfc8025e357fc7132"} Mar 21 05:29:27 crc kubenswrapper[4580]: I0321 05:29:27.046318 4580 generic.go:334] "Generic (PLEG): container finished" podID="b0467c5a-5605-4a99-85ec-19b82cb85532" containerID="8d84da3e9035cf308aae07521766f199a50acd49ca2d778dfc8025e357fc7132" exitCode=0 Mar 21 05:29:27 crc kubenswrapper[4580]: I0321 05:29:27.046406 4580 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-6q428" event={"ID":"b0467c5a-5605-4a99-85ec-19b82cb85532","Type":"ContainerDied","Data":"8d84da3e9035cf308aae07521766f199a50acd49ca2d778dfc8025e357fc7132"} Mar 21 05:29:28 crc kubenswrapper[4580]: I0321 05:29:28.056497 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6q428" event={"ID":"b0467c5a-5605-4a99-85ec-19b82cb85532","Type":"ContainerStarted","Data":"d5fd1092ebe2401bbb56ed7fc5cd1e5f7321322463ff35f5eab78ed4b9c5c103"} Mar 21 05:29:28 crc kubenswrapper[4580]: I0321 05:29:28.085368 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6q428" podStartSLOduration=2.582449876 podStartE2EDuration="6.085342771s" podCreationTimestamp="2026-03-21 05:29:22 +0000 UTC" firstStartedPulling="2026-03-21 05:29:24.004121147 +0000 UTC m=+2269.086704775" lastFinishedPulling="2026-03-21 05:29:27.507014042 +0000 UTC m=+2272.589597670" observedRunningTime="2026-03-21 05:29:28.081844907 +0000 UTC m=+2273.164428535" watchObservedRunningTime="2026-03-21 05:29:28.085342771 +0000 UTC m=+2273.167926399" Mar 21 05:29:33 crc kubenswrapper[4580]: I0321 05:29:33.354852 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:33 crc kubenswrapper[4580]: I0321 05:29:33.355621 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:33 crc kubenswrapper[4580]: I0321 05:29:33.414718 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:34 crc kubenswrapper[4580]: I0321 05:29:34.173005 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:34 crc kubenswrapper[4580]: I0321 05:29:34.236612 4580 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6q428"] Mar 21 05:29:36 crc kubenswrapper[4580]: I0321 05:29:36.129542 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6q428" podUID="b0467c5a-5605-4a99-85ec-19b82cb85532" containerName="registry-server" containerID="cri-o://d5fd1092ebe2401bbb56ed7fc5cd1e5f7321322463ff35f5eab78ed4b9c5c103" gracePeriod=2 Mar 21 05:29:36 crc kubenswrapper[4580]: I0321 05:29:36.571387 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:36 crc kubenswrapper[4580]: I0321 05:29:36.655036 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0467c5a-5605-4a99-85ec-19b82cb85532-utilities\") pod \"b0467c5a-5605-4a99-85ec-19b82cb85532\" (UID: \"b0467c5a-5605-4a99-85ec-19b82cb85532\") " Mar 21 05:29:36 crc kubenswrapper[4580]: I0321 05:29:36.655932 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0467c5a-5605-4a99-85ec-19b82cb85532-utilities" (OuterVolumeSpecName: "utilities") pod "b0467c5a-5605-4a99-85ec-19b82cb85532" (UID: "b0467c5a-5605-4a99-85ec-19b82cb85532"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:29:36 crc kubenswrapper[4580]: I0321 05:29:36.656017 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42tqn\" (UniqueName: \"kubernetes.io/projected/b0467c5a-5605-4a99-85ec-19b82cb85532-kube-api-access-42tqn\") pod \"b0467c5a-5605-4a99-85ec-19b82cb85532\" (UID: \"b0467c5a-5605-4a99-85ec-19b82cb85532\") " Mar 21 05:29:36 crc kubenswrapper[4580]: I0321 05:29:36.656053 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0467c5a-5605-4a99-85ec-19b82cb85532-catalog-content\") pod \"b0467c5a-5605-4a99-85ec-19b82cb85532\" (UID: \"b0467c5a-5605-4a99-85ec-19b82cb85532\") " Mar 21 05:29:36 crc kubenswrapper[4580]: I0321 05:29:36.657415 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0467c5a-5605-4a99-85ec-19b82cb85532-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:36 crc kubenswrapper[4580]: I0321 05:29:36.661605 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0467c5a-5605-4a99-85ec-19b82cb85532-kube-api-access-42tqn" (OuterVolumeSpecName: "kube-api-access-42tqn") pod "b0467c5a-5605-4a99-85ec-19b82cb85532" (UID: "b0467c5a-5605-4a99-85ec-19b82cb85532"). InnerVolumeSpecName "kube-api-access-42tqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:29:36 crc kubenswrapper[4580]: I0321 05:29:36.684277 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0467c5a-5605-4a99-85ec-19b82cb85532-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0467c5a-5605-4a99-85ec-19b82cb85532" (UID: "b0467c5a-5605-4a99-85ec-19b82cb85532"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:29:36 crc kubenswrapper[4580]: I0321 05:29:36.759919 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42tqn\" (UniqueName: \"kubernetes.io/projected/b0467c5a-5605-4a99-85ec-19b82cb85532-kube-api-access-42tqn\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:36 crc kubenswrapper[4580]: I0321 05:29:36.759996 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0467c5a-5605-4a99-85ec-19b82cb85532-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.139969 4580 generic.go:334] "Generic (PLEG): container finished" podID="b0467c5a-5605-4a99-85ec-19b82cb85532" containerID="d5fd1092ebe2401bbb56ed7fc5cd1e5f7321322463ff35f5eab78ed4b9c5c103" exitCode=0 Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.140016 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6q428" event={"ID":"b0467c5a-5605-4a99-85ec-19b82cb85532","Type":"ContainerDied","Data":"d5fd1092ebe2401bbb56ed7fc5cd1e5f7321322463ff35f5eab78ed4b9c5c103"} Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.140348 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6q428" event={"ID":"b0467c5a-5605-4a99-85ec-19b82cb85532","Type":"ContainerDied","Data":"36da780c7df7f1e5cb29f46b98b0138c1f8dee00a2e8a0de123aa8a87453efbd"} Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.140390 4580 scope.go:117] "RemoveContainer" containerID="d5fd1092ebe2401bbb56ed7fc5cd1e5f7321322463ff35f5eab78ed4b9c5c103" Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.140071 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6q428" Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.162120 4580 scope.go:117] "RemoveContainer" containerID="8d84da3e9035cf308aae07521766f199a50acd49ca2d778dfc8025e357fc7132" Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.186847 4580 scope.go:117] "RemoveContainer" containerID="f5b6b27a5397f04a3e0f03cbe80d97e46f8484197dd4ac7810f5bc7b4a0083eb" Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.196983 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6q428"] Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.208733 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6q428"] Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.245478 4580 scope.go:117] "RemoveContainer" containerID="d5fd1092ebe2401bbb56ed7fc5cd1e5f7321322463ff35f5eab78ed4b9c5c103" Mar 21 05:29:37 crc kubenswrapper[4580]: E0321 05:29:37.247055 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5fd1092ebe2401bbb56ed7fc5cd1e5f7321322463ff35f5eab78ed4b9c5c103\": container with ID starting with d5fd1092ebe2401bbb56ed7fc5cd1e5f7321322463ff35f5eab78ed4b9c5c103 not found: ID does not exist" containerID="d5fd1092ebe2401bbb56ed7fc5cd1e5f7321322463ff35f5eab78ed4b9c5c103" Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.247094 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5fd1092ebe2401bbb56ed7fc5cd1e5f7321322463ff35f5eab78ed4b9c5c103"} err="failed to get container status \"d5fd1092ebe2401bbb56ed7fc5cd1e5f7321322463ff35f5eab78ed4b9c5c103\": rpc error: code = NotFound desc = could not find container \"d5fd1092ebe2401bbb56ed7fc5cd1e5f7321322463ff35f5eab78ed4b9c5c103\": container with ID starting with d5fd1092ebe2401bbb56ed7fc5cd1e5f7321322463ff35f5eab78ed4b9c5c103 not found: 
ID does not exist" Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.247123 4580 scope.go:117] "RemoveContainer" containerID="8d84da3e9035cf308aae07521766f199a50acd49ca2d778dfc8025e357fc7132" Mar 21 05:29:37 crc kubenswrapper[4580]: E0321 05:29:37.247706 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d84da3e9035cf308aae07521766f199a50acd49ca2d778dfc8025e357fc7132\": container with ID starting with 8d84da3e9035cf308aae07521766f199a50acd49ca2d778dfc8025e357fc7132 not found: ID does not exist" containerID="8d84da3e9035cf308aae07521766f199a50acd49ca2d778dfc8025e357fc7132" Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.247735 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d84da3e9035cf308aae07521766f199a50acd49ca2d778dfc8025e357fc7132"} err="failed to get container status \"8d84da3e9035cf308aae07521766f199a50acd49ca2d778dfc8025e357fc7132\": rpc error: code = NotFound desc = could not find container \"8d84da3e9035cf308aae07521766f199a50acd49ca2d778dfc8025e357fc7132\": container with ID starting with 8d84da3e9035cf308aae07521766f199a50acd49ca2d778dfc8025e357fc7132 not found: ID does not exist" Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.247754 4580 scope.go:117] "RemoveContainer" containerID="f5b6b27a5397f04a3e0f03cbe80d97e46f8484197dd4ac7810f5bc7b4a0083eb" Mar 21 05:29:37 crc kubenswrapper[4580]: E0321 05:29:37.248277 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5b6b27a5397f04a3e0f03cbe80d97e46f8484197dd4ac7810f5bc7b4a0083eb\": container with ID starting with f5b6b27a5397f04a3e0f03cbe80d97e46f8484197dd4ac7810f5bc7b4a0083eb not found: ID does not exist" containerID="f5b6b27a5397f04a3e0f03cbe80d97e46f8484197dd4ac7810f5bc7b4a0083eb" Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.248302 4580 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b6b27a5397f04a3e0f03cbe80d97e46f8484197dd4ac7810f5bc7b4a0083eb"} err="failed to get container status \"f5b6b27a5397f04a3e0f03cbe80d97e46f8484197dd4ac7810f5bc7b4a0083eb\": rpc error: code = NotFound desc = could not find container \"f5b6b27a5397f04a3e0f03cbe80d97e46f8484197dd4ac7810f5bc7b4a0083eb\": container with ID starting with f5b6b27a5397f04a3e0f03cbe80d97e46f8484197dd4ac7810f5bc7b4a0083eb not found: ID does not exist" Mar 21 05:29:37 crc kubenswrapper[4580]: I0321 05:29:37.628356 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0467c5a-5605-4a99-85ec-19b82cb85532" path="/var/lib/kubelet/pods/b0467c5a-5605-4a99-85ec-19b82cb85532/volumes" Mar 21 05:29:39 crc kubenswrapper[4580]: I0321 05:29:39.159071 4580 generic.go:334] "Generic (PLEG): container finished" podID="fe57de6b-1ee3-4bdb-91b8-d81369a7fc72" containerID="f6b66a52d85e2e967f2132e5440c3ec9765a7bad220b8029d23335cb92eca2d0" exitCode=0 Mar 21 05:29:39 crc kubenswrapper[4580]: I0321 05:29:39.159110 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" event={"ID":"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72","Type":"ContainerDied","Data":"f6b66a52d85e2e967f2132e5440c3ec9765a7bad220b8029d23335cb92eca2d0"} Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.553445 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.652857 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ovncontroller-config-0\") pod \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.652959 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ssh-key-openstack-edpm-ipam\") pod \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.653041 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ovn-combined-ca-bundle\") pod \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.653147 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-inventory\") pod \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.653212 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v78pk\" (UniqueName: \"kubernetes.io/projected/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-kube-api-access-v78pk\") pod \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\" (UID: \"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72\") " Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.665129 4580 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "fe57de6b-1ee3-4bdb-91b8-d81369a7fc72" (UID: "fe57de6b-1ee3-4bdb-91b8-d81369a7fc72"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.665199 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-kube-api-access-v78pk" (OuterVolumeSpecName: "kube-api-access-v78pk") pod "fe57de6b-1ee3-4bdb-91b8-d81369a7fc72" (UID: "fe57de6b-1ee3-4bdb-91b8-d81369a7fc72"). InnerVolumeSpecName "kube-api-access-v78pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.679334 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "fe57de6b-1ee3-4bdb-91b8-d81369a7fc72" (UID: "fe57de6b-1ee3-4bdb-91b8-d81369a7fc72"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.680286 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-inventory" (OuterVolumeSpecName: "inventory") pod "fe57de6b-1ee3-4bdb-91b8-d81369a7fc72" (UID: "fe57de6b-1ee3-4bdb-91b8-d81369a7fc72"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.690169 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fe57de6b-1ee3-4bdb-91b8-d81369a7fc72" (UID: "fe57de6b-1ee3-4bdb-91b8-d81369a7fc72"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.755563 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v78pk\" (UniqueName: \"kubernetes.io/projected/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-kube-api-access-v78pk\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.755846 4580 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.755855 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.755864 4580 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:40 crc kubenswrapper[4580]: I0321 05:29:40.755873 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe57de6b-1ee3-4bdb-91b8-d81369a7fc72-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.179983 4580 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" event={"ID":"fe57de6b-1ee3-4bdb-91b8-d81369a7fc72","Type":"ContainerDied","Data":"dc4c88ca94f6e27d04953844fdd5c0a807e111a31e471af309b374f7184e484c"} Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.180026 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc4c88ca94f6e27d04953844fdd5c0a807e111a31e471af309b374f7184e484c" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.180082 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-j2klj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.293352 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj"] Mar 21 05:29:41 crc kubenswrapper[4580]: E0321 05:29:41.293709 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0467c5a-5605-4a99-85ec-19b82cb85532" containerName="registry-server" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.293723 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0467c5a-5605-4a99-85ec-19b82cb85532" containerName="registry-server" Mar 21 05:29:41 crc kubenswrapper[4580]: E0321 05:29:41.293752 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0467c5a-5605-4a99-85ec-19b82cb85532" containerName="extract-content" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.293758 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0467c5a-5605-4a99-85ec-19b82cb85532" containerName="extract-content" Mar 21 05:29:41 crc kubenswrapper[4580]: E0321 05:29:41.293768 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe57de6b-1ee3-4bdb-91b8-d81369a7fc72" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.293774 4580 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="fe57de6b-1ee3-4bdb-91b8-d81369a7fc72" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 21 05:29:41 crc kubenswrapper[4580]: E0321 05:29:41.293799 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0467c5a-5605-4a99-85ec-19b82cb85532" containerName="extract-utilities" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.293806 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0467c5a-5605-4a99-85ec-19b82cb85532" containerName="extract-utilities" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.293974 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0467c5a-5605-4a99-85ec-19b82cb85532" containerName="registry-server" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.293992 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe57de6b-1ee3-4bdb-91b8-d81369a7fc72" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.294635 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.297354 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.298308 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.298312 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.298500 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.298705 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.298728 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.318048 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj"] Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.366351 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc6cd\" (UniqueName: \"kubernetes.io/projected/24785e2f-2d74-4dd1-97dd-10e58843652e-kube-api-access-vc6cd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.366408 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.366444 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.366465 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.366501 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.366534 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.468123 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc6cd\" (UniqueName: \"kubernetes.io/projected/24785e2f-2d74-4dd1-97dd-10e58843652e-kube-api-access-vc6cd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.468207 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.468264 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.468291 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.468339 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.468374 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.474109 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.474221 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.476222 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.478228 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.487906 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.489669 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc6cd\" (UniqueName: \"kubernetes.io/projected/24785e2f-2d74-4dd1-97dd-10e58843652e-kube-api-access-vc6cd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:41 crc kubenswrapper[4580]: I0321 05:29:41.612708 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:29:42 crc kubenswrapper[4580]: W0321 05:29:42.147183 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24785e2f_2d74_4dd1_97dd_10e58843652e.slice/crio-35426d5522941b1419ee4f4e1eeb1e65a0b48698343200d6dbd1a6565afc0634 WatchSource:0}: Error finding container 35426d5522941b1419ee4f4e1eeb1e65a0b48698343200d6dbd1a6565afc0634: Status 404 returned error can't find the container with id 35426d5522941b1419ee4f4e1eeb1e65a0b48698343200d6dbd1a6565afc0634 Mar 21 05:29:42 crc kubenswrapper[4580]: I0321 05:29:42.149506 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj"] Mar 21 05:29:42 crc kubenswrapper[4580]: I0321 05:29:42.193383 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" event={"ID":"24785e2f-2d74-4dd1-97dd-10e58843652e","Type":"ContainerStarted","Data":"35426d5522941b1419ee4f4e1eeb1e65a0b48698343200d6dbd1a6565afc0634"} Mar 21 05:29:43 crc kubenswrapper[4580]: I0321 05:29:43.202825 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" event={"ID":"24785e2f-2d74-4dd1-97dd-10e58843652e","Type":"ContainerStarted","Data":"ebe7760ff71f61047887d9422fcf3d06025dddc03ec5e5c2ed74bec2aae65cb9"} Mar 21 05:29:43 crc kubenswrapper[4580]: I0321 05:29:43.222401 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" podStartSLOduration=1.476148211 podStartE2EDuration="2.222382539s" 
podCreationTimestamp="2026-03-21 05:29:41 +0000 UTC" firstStartedPulling="2026-03-21 05:29:42.149852989 +0000 UTC m=+2287.232436627" lastFinishedPulling="2026-03-21 05:29:42.896087327 +0000 UTC m=+2287.978670955" observedRunningTime="2026-03-21 05:29:43.21834929 +0000 UTC m=+2288.300932948" watchObservedRunningTime="2026-03-21 05:29:43.222382539 +0000 UTC m=+2288.304966167" Mar 21 05:29:45 crc kubenswrapper[4580]: I0321 05:29:45.947926 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:29:45 crc kubenswrapper[4580]: I0321 05:29:45.948287 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.159527 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567850-9dcqw"] Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.161462 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-9dcqw" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.164800 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.165531 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.166251 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.173661 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm"] Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.175170 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.183019 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-9dcqw"] Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.184061 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.184415 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.195482 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm"] Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.254844 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wf7b\" (UniqueName: 
\"kubernetes.io/projected/12dd562a-3880-43fd-a29e-daa9062324d5-kube-api-access-7wf7b\") pod \"collect-profiles-29567850-gcqmm\" (UID: \"12dd562a-3880-43fd-a29e-daa9062324d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.254987 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12dd562a-3880-43fd-a29e-daa9062324d5-config-volume\") pod \"collect-profiles-29567850-gcqmm\" (UID: \"12dd562a-3880-43fd-a29e-daa9062324d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.255040 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12dd562a-3880-43fd-a29e-daa9062324d5-secret-volume\") pod \"collect-profiles-29567850-gcqmm\" (UID: \"12dd562a-3880-43fd-a29e-daa9062324d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.255191 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4twg\" (UniqueName: \"kubernetes.io/projected/b0bb77cf-8c33-4f71-b2ba-8ad78fd21467-kube-api-access-c4twg\") pod \"auto-csr-approver-29567850-9dcqw\" (UID: \"b0bb77cf-8c33-4f71-b2ba-8ad78fd21467\") " pod="openshift-infra/auto-csr-approver-29567850-9dcqw" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.357767 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12dd562a-3880-43fd-a29e-daa9062324d5-secret-volume\") pod \"collect-profiles-29567850-gcqmm\" (UID: \"12dd562a-3880-43fd-a29e-daa9062324d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" Mar 21 05:30:00 
crc kubenswrapper[4580]: I0321 05:30:00.357877 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4twg\" (UniqueName: \"kubernetes.io/projected/b0bb77cf-8c33-4f71-b2ba-8ad78fd21467-kube-api-access-c4twg\") pod \"auto-csr-approver-29567850-9dcqw\" (UID: \"b0bb77cf-8c33-4f71-b2ba-8ad78fd21467\") " pod="openshift-infra/auto-csr-approver-29567850-9dcqw" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.357942 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wf7b\" (UniqueName: \"kubernetes.io/projected/12dd562a-3880-43fd-a29e-daa9062324d5-kube-api-access-7wf7b\") pod \"collect-profiles-29567850-gcqmm\" (UID: \"12dd562a-3880-43fd-a29e-daa9062324d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.358099 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12dd562a-3880-43fd-a29e-daa9062324d5-config-volume\") pod \"collect-profiles-29567850-gcqmm\" (UID: \"12dd562a-3880-43fd-a29e-daa9062324d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.358973 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12dd562a-3880-43fd-a29e-daa9062324d5-config-volume\") pod \"collect-profiles-29567850-gcqmm\" (UID: \"12dd562a-3880-43fd-a29e-daa9062324d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.373853 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12dd562a-3880-43fd-a29e-daa9062324d5-secret-volume\") pod \"collect-profiles-29567850-gcqmm\" (UID: 
\"12dd562a-3880-43fd-a29e-daa9062324d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.375143 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wf7b\" (UniqueName: \"kubernetes.io/projected/12dd562a-3880-43fd-a29e-daa9062324d5-kube-api-access-7wf7b\") pod \"collect-profiles-29567850-gcqmm\" (UID: \"12dd562a-3880-43fd-a29e-daa9062324d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.385877 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4twg\" (UniqueName: \"kubernetes.io/projected/b0bb77cf-8c33-4f71-b2ba-8ad78fd21467-kube-api-access-c4twg\") pod \"auto-csr-approver-29567850-9dcqw\" (UID: \"b0bb77cf-8c33-4f71-b2ba-8ad78fd21467\") " pod="openshift-infra/auto-csr-approver-29567850-9dcqw" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.489024 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-9dcqw" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.505022 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" Mar 21 05:30:00 crc kubenswrapper[4580]: I0321 05:30:00.989088 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-9dcqw"] Mar 21 05:30:01 crc kubenswrapper[4580]: W0321 05:30:01.060918 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12dd562a_3880_43fd_a29e_daa9062324d5.slice/crio-e196d35561f68c792be86c3f134d4af30d5a794c446552e4858eb9c0d629335a WatchSource:0}: Error finding container e196d35561f68c792be86c3f134d4af30d5a794c446552e4858eb9c0d629335a: Status 404 returned error can't find the container with id e196d35561f68c792be86c3f134d4af30d5a794c446552e4858eb9c0d629335a Mar 21 05:30:01 crc kubenswrapper[4580]: I0321 05:30:01.070811 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm"] Mar 21 05:30:01 crc kubenswrapper[4580]: I0321 05:30:01.346290 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" event={"ID":"12dd562a-3880-43fd-a29e-daa9062324d5","Type":"ContainerStarted","Data":"ca42d24d702749bc9d20d9901d6f0807acd843939d207e0982a7a26f955dd89c"} Mar 21 05:30:01 crc kubenswrapper[4580]: I0321 05:30:01.346337 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" event={"ID":"12dd562a-3880-43fd-a29e-daa9062324d5","Type":"ContainerStarted","Data":"e196d35561f68c792be86c3f134d4af30d5a794c446552e4858eb9c0d629335a"} Mar 21 05:30:01 crc kubenswrapper[4580]: I0321 05:30:01.352442 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567850-9dcqw" 
event={"ID":"b0bb77cf-8c33-4f71-b2ba-8ad78fd21467","Type":"ContainerStarted","Data":"c628251034bcb7a14211bf3af6d0f94e8cda1846d30d7a396d1b2005cb1d5b56"} Mar 21 05:30:01 crc kubenswrapper[4580]: I0321 05:30:01.371107 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" podStartSLOduration=1.37108977 podStartE2EDuration="1.37108977s" podCreationTimestamp="2026-03-21 05:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:30:01.36254287 +0000 UTC m=+2306.445126498" watchObservedRunningTime="2026-03-21 05:30:01.37108977 +0000 UTC m=+2306.453673398" Mar 21 05:30:02 crc kubenswrapper[4580]: I0321 05:30:02.365120 4580 generic.go:334] "Generic (PLEG): container finished" podID="12dd562a-3880-43fd-a29e-daa9062324d5" containerID="ca42d24d702749bc9d20d9901d6f0807acd843939d207e0982a7a26f955dd89c" exitCode=0 Mar 21 05:30:02 crc kubenswrapper[4580]: I0321 05:30:02.365285 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" event={"ID":"12dd562a-3880-43fd-a29e-daa9062324d5","Type":"ContainerDied","Data":"ca42d24d702749bc9d20d9901d6f0807acd843939d207e0982a7a26f955dd89c"} Mar 21 05:30:03 crc kubenswrapper[4580]: I0321 05:30:03.389303 4580 generic.go:334] "Generic (PLEG): container finished" podID="b0bb77cf-8c33-4f71-b2ba-8ad78fd21467" containerID="30bed1e95a5e72b436b1e97c397d6b82dea02f5a02db029106185e3efed54d59" exitCode=0 Mar 21 05:30:03 crc kubenswrapper[4580]: I0321 05:30:03.389341 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567850-9dcqw" event={"ID":"b0bb77cf-8c33-4f71-b2ba-8ad78fd21467","Type":"ContainerDied","Data":"30bed1e95a5e72b436b1e97c397d6b82dea02f5a02db029106185e3efed54d59"} Mar 21 05:30:03 crc kubenswrapper[4580]: I0321 05:30:03.713521 4580 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" Mar 21 05:30:03 crc kubenswrapper[4580]: I0321 05:30:03.828731 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wf7b\" (UniqueName: \"kubernetes.io/projected/12dd562a-3880-43fd-a29e-daa9062324d5-kube-api-access-7wf7b\") pod \"12dd562a-3880-43fd-a29e-daa9062324d5\" (UID: \"12dd562a-3880-43fd-a29e-daa9062324d5\") " Mar 21 05:30:03 crc kubenswrapper[4580]: I0321 05:30:03.828806 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12dd562a-3880-43fd-a29e-daa9062324d5-config-volume\") pod \"12dd562a-3880-43fd-a29e-daa9062324d5\" (UID: \"12dd562a-3880-43fd-a29e-daa9062324d5\") " Mar 21 05:30:03 crc kubenswrapper[4580]: I0321 05:30:03.828863 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12dd562a-3880-43fd-a29e-daa9062324d5-secret-volume\") pod \"12dd562a-3880-43fd-a29e-daa9062324d5\" (UID: \"12dd562a-3880-43fd-a29e-daa9062324d5\") " Mar 21 05:30:03 crc kubenswrapper[4580]: I0321 05:30:03.829901 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12dd562a-3880-43fd-a29e-daa9062324d5-config-volume" (OuterVolumeSpecName: "config-volume") pod "12dd562a-3880-43fd-a29e-daa9062324d5" (UID: "12dd562a-3880-43fd-a29e-daa9062324d5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:30:03 crc kubenswrapper[4580]: I0321 05:30:03.830415 4580 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12dd562a-3880-43fd-a29e-daa9062324d5-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:03 crc kubenswrapper[4580]: I0321 05:30:03.835005 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12dd562a-3880-43fd-a29e-daa9062324d5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "12dd562a-3880-43fd-a29e-daa9062324d5" (UID: "12dd562a-3880-43fd-a29e-daa9062324d5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:30:03 crc kubenswrapper[4580]: I0321 05:30:03.836171 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12dd562a-3880-43fd-a29e-daa9062324d5-kube-api-access-7wf7b" (OuterVolumeSpecName: "kube-api-access-7wf7b") pod "12dd562a-3880-43fd-a29e-daa9062324d5" (UID: "12dd562a-3880-43fd-a29e-daa9062324d5"). InnerVolumeSpecName "kube-api-access-7wf7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:30:03 crc kubenswrapper[4580]: I0321 05:30:03.932628 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wf7b\" (UniqueName: \"kubernetes.io/projected/12dd562a-3880-43fd-a29e-daa9062324d5-kube-api-access-7wf7b\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:03 crc kubenswrapper[4580]: I0321 05:30:03.932656 4580 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12dd562a-3880-43fd-a29e-daa9062324d5-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:04 crc kubenswrapper[4580]: I0321 05:30:04.400908 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" event={"ID":"12dd562a-3880-43fd-a29e-daa9062324d5","Type":"ContainerDied","Data":"e196d35561f68c792be86c3f134d4af30d5a794c446552e4858eb9c0d629335a"} Mar 21 05:30:04 crc kubenswrapper[4580]: I0321 05:30:04.400950 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e196d35561f68c792be86c3f134d4af30d5a794c446552e4858eb9c0d629335a" Mar 21 05:30:04 crc kubenswrapper[4580]: I0321 05:30:04.400954 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-gcqmm" Mar 21 05:30:04 crc kubenswrapper[4580]: I0321 05:30:04.446685 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq"] Mar 21 05:30:04 crc kubenswrapper[4580]: I0321 05:30:04.455775 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-jljjq"] Mar 21 05:30:04 crc kubenswrapper[4580]: I0321 05:30:04.712058 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-9dcqw" Mar 21 05:30:04 crc kubenswrapper[4580]: I0321 05:30:04.747102 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4twg\" (UniqueName: \"kubernetes.io/projected/b0bb77cf-8c33-4f71-b2ba-8ad78fd21467-kube-api-access-c4twg\") pod \"b0bb77cf-8c33-4f71-b2ba-8ad78fd21467\" (UID: \"b0bb77cf-8c33-4f71-b2ba-8ad78fd21467\") " Mar 21 05:30:04 crc kubenswrapper[4580]: I0321 05:30:04.751590 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0bb77cf-8c33-4f71-b2ba-8ad78fd21467-kube-api-access-c4twg" (OuterVolumeSpecName: "kube-api-access-c4twg") pod "b0bb77cf-8c33-4f71-b2ba-8ad78fd21467" (UID: "b0bb77cf-8c33-4f71-b2ba-8ad78fd21467"). InnerVolumeSpecName "kube-api-access-c4twg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:30:04 crc kubenswrapper[4580]: I0321 05:30:04.849596 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4twg\" (UniqueName: \"kubernetes.io/projected/b0bb77cf-8c33-4f71-b2ba-8ad78fd21467-kube-api-access-c4twg\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:05 crc kubenswrapper[4580]: I0321 05:30:05.410138 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567850-9dcqw" event={"ID":"b0bb77cf-8c33-4f71-b2ba-8ad78fd21467","Type":"ContainerDied","Data":"c628251034bcb7a14211bf3af6d0f94e8cda1846d30d7a396d1b2005cb1d5b56"} Mar 21 05:30:05 crc kubenswrapper[4580]: I0321 05:30:05.410404 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c628251034bcb7a14211bf3af6d0f94e8cda1846d30d7a396d1b2005cb1d5b56" Mar 21 05:30:05 crc kubenswrapper[4580]: I0321 05:30:05.410202 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-9dcqw" Mar 21 05:30:05 crc kubenswrapper[4580]: I0321 05:30:05.631262 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92b13d13-f88e-47cc-8815-34b54fd68711" path="/var/lib/kubelet/pods/92b13d13-f88e-47cc-8815-34b54fd68711/volumes" Mar 21 05:30:05 crc kubenswrapper[4580]: I0321 05:30:05.774048 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-nn75n"] Mar 21 05:30:05 crc kubenswrapper[4580]: I0321 05:30:05.782511 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-nn75n"] Mar 21 05:30:07 crc kubenswrapper[4580]: I0321 05:30:07.629837 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487961e4-9ba8-498a-8330-ef215db7eb8e" path="/var/lib/kubelet/pods/487961e4-9ba8-498a-8330-ef215db7eb8e/volumes" Mar 21 05:30:14 crc kubenswrapper[4580]: I0321 05:30:14.059916 4580 scope.go:117] "RemoveContainer" containerID="a6be6f616e65ef14a7724f8ea93ae87aaa63f9e0f48e3f37e0b4c8261348e254" Mar 21 05:30:14 crc kubenswrapper[4580]: I0321 05:30:14.096857 4580 scope.go:117] "RemoveContainer" containerID="5f9d64afaa5aed2bc32b9e3a39c16db91918bf5aa80be5631dae3f3adc43f0f1" Mar 21 05:30:15 crc kubenswrapper[4580]: I0321 05:30:15.948283 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:30:15 crc kubenswrapper[4580]: I0321 05:30:15.949327 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 21 05:30:15 crc kubenswrapper[4580]: I0321 05:30:15.949381 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 05:30:15 crc kubenswrapper[4580]: I0321 05:30:15.950279 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bcd83d2a37c6af8524563e27b746db2c348fe00e999b54fcbbce11e47079c45b"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:30:15 crc kubenswrapper[4580]: I0321 05:30:15.950351 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://bcd83d2a37c6af8524563e27b746db2c348fe00e999b54fcbbce11e47079c45b" gracePeriod=600 Mar 21 05:30:16 crc kubenswrapper[4580]: I0321 05:30:16.514388 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="bcd83d2a37c6af8524563e27b746db2c348fe00e999b54fcbbce11e47079c45b" exitCode=0 Mar 21 05:30:16 crc kubenswrapper[4580]: I0321 05:30:16.514460 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"bcd83d2a37c6af8524563e27b746db2c348fe00e999b54fcbbce11e47079c45b"} Mar 21 05:30:16 crc kubenswrapper[4580]: I0321 05:30:16.515073 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" 
event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7"} Mar 21 05:30:16 crc kubenswrapper[4580]: I0321 05:30:16.515098 4580 scope.go:117] "RemoveContainer" containerID="66a22767143df86b1e4861a9d69abd599234f78bb09b1e28c3547296cb98ee45" Mar 21 05:30:30 crc kubenswrapper[4580]: I0321 05:30:30.643726 4580 generic.go:334] "Generic (PLEG): container finished" podID="24785e2f-2d74-4dd1-97dd-10e58843652e" containerID="ebe7760ff71f61047887d9422fcf3d06025dddc03ec5e5c2ed74bec2aae65cb9" exitCode=0 Mar 21 05:30:30 crc kubenswrapper[4580]: I0321 05:30:30.643834 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" event={"ID":"24785e2f-2d74-4dd1-97dd-10e58843652e","Type":"ContainerDied","Data":"ebe7760ff71f61047887d9422fcf3d06025dddc03ec5e5c2ed74bec2aae65cb9"} Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.101995 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.206874 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-ssh-key-openstack-edpm-ipam\") pod \"24785e2f-2d74-4dd1-97dd-10e58843652e\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.207059 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-inventory\") pod \"24785e2f-2d74-4dd1-97dd-10e58843652e\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.207090 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-neutron-metadata-combined-ca-bundle\") pod \"24785e2f-2d74-4dd1-97dd-10e58843652e\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.207150 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"24785e2f-2d74-4dd1-97dd-10e58843652e\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.207191 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc6cd\" (UniqueName: \"kubernetes.io/projected/24785e2f-2d74-4dd1-97dd-10e58843652e-kube-api-access-vc6cd\") pod \"24785e2f-2d74-4dd1-97dd-10e58843652e\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " Mar 21 
05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.207247 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-nova-metadata-neutron-config-0\") pod \"24785e2f-2d74-4dd1-97dd-10e58843652e\" (UID: \"24785e2f-2d74-4dd1-97dd-10e58843652e\") " Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.212533 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "24785e2f-2d74-4dd1-97dd-10e58843652e" (UID: "24785e2f-2d74-4dd1-97dd-10e58843652e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.213284 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24785e2f-2d74-4dd1-97dd-10e58843652e-kube-api-access-vc6cd" (OuterVolumeSpecName: "kube-api-access-vc6cd") pod "24785e2f-2d74-4dd1-97dd-10e58843652e" (UID: "24785e2f-2d74-4dd1-97dd-10e58843652e"). InnerVolumeSpecName "kube-api-access-vc6cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.233735 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-inventory" (OuterVolumeSpecName: "inventory") pod "24785e2f-2d74-4dd1-97dd-10e58843652e" (UID: "24785e2f-2d74-4dd1-97dd-10e58843652e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.238006 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "24785e2f-2d74-4dd1-97dd-10e58843652e" (UID: "24785e2f-2d74-4dd1-97dd-10e58843652e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.243493 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "24785e2f-2d74-4dd1-97dd-10e58843652e" (UID: "24785e2f-2d74-4dd1-97dd-10e58843652e"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.256423 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "24785e2f-2d74-4dd1-97dd-10e58843652e" (UID: "24785e2f-2d74-4dd1-97dd-10e58843652e"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.309566 4580 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.309611 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc6cd\" (UniqueName: \"kubernetes.io/projected/24785e2f-2d74-4dd1-97dd-10e58843652e-kube-api-access-vc6cd\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.309625 4580 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.310340 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.310372 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.310385 4580 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24785e2f-2d74-4dd1-97dd-10e58843652e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.664499 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" event={"ID":"24785e2f-2d74-4dd1-97dd-10e58843652e","Type":"ContainerDied","Data":"35426d5522941b1419ee4f4e1eeb1e65a0b48698343200d6dbd1a6565afc0634"} Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.664546 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35426d5522941b1419ee4f4e1eeb1e65a0b48698343200d6dbd1a6565afc0634" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.664609 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.757909 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv"] Mar 21 05:30:32 crc kubenswrapper[4580]: E0321 05:30:32.758402 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0bb77cf-8c33-4f71-b2ba-8ad78fd21467" containerName="oc" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.758425 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0bb77cf-8c33-4f71-b2ba-8ad78fd21467" containerName="oc" Mar 21 05:30:32 crc kubenswrapper[4580]: E0321 05:30:32.758464 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24785e2f-2d74-4dd1-97dd-10e58843652e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.758474 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="24785e2f-2d74-4dd1-97dd-10e58843652e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 21 05:30:32 crc kubenswrapper[4580]: E0321 05:30:32.758493 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12dd562a-3880-43fd-a29e-daa9062324d5" containerName="collect-profiles" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.758502 4580 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="12dd562a-3880-43fd-a29e-daa9062324d5" containerName="collect-profiles" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.758716 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="12dd562a-3880-43fd-a29e-daa9062324d5" containerName="collect-profiles" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.758737 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0bb77cf-8c33-4f71-b2ba-8ad78fd21467" containerName="oc" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.758769 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="24785e2f-2d74-4dd1-97dd-10e58843652e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.759393 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.761584 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.761891 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.763513 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.765414 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.765629 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.795827 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv"] Mar 21 
05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.820164 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.820228 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhkf2\" (UniqueName: \"kubernetes.io/projected/e355e210-9abe-4bdf-bcbf-70e95e437482-kube-api-access-rhkf2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.820283 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.820413 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.820574 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.922630 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.922683 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.922755 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.922866 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.922899 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhkf2\" (UniqueName: \"kubernetes.io/projected/e355e210-9abe-4bdf-bcbf-70e95e437482-kube-api-access-rhkf2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.927335 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.927860 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.927951 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.930176 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:32 crc kubenswrapper[4580]: I0321 05:30:32.939717 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhkf2\" (UniqueName: \"kubernetes.io/projected/e355e210-9abe-4bdf-bcbf-70e95e437482-kube-api-access-rhkf2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:33 crc kubenswrapper[4580]: I0321 05:30:33.082093 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:30:33 crc kubenswrapper[4580]: I0321 05:30:33.647389 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv"] Mar 21 05:30:33 crc kubenswrapper[4580]: I0321 05:30:33.682501 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" event={"ID":"e355e210-9abe-4bdf-bcbf-70e95e437482","Type":"ContainerStarted","Data":"00faafec901b10279dbeb50cdf2046f53574128c31e66fb936b39d293cb0f773"} Mar 21 05:30:34 crc kubenswrapper[4580]: I0321 05:30:34.692819 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" event={"ID":"e355e210-9abe-4bdf-bcbf-70e95e437482","Type":"ContainerStarted","Data":"a8197a60fb63f50c696bef932374d0d35c1355874672400885bdb7bfbfa74afe"} Mar 21 05:30:34 crc kubenswrapper[4580]: I0321 05:30:34.723522 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" podStartSLOduration=2.306980859 
podStartE2EDuration="2.723499324s" podCreationTimestamp="2026-03-21 05:30:32 +0000 UTC" firstStartedPulling="2026-03-21 05:30:33.642238249 +0000 UTC m=+2338.724821877" lastFinishedPulling="2026-03-21 05:30:34.058756694 +0000 UTC m=+2339.141340342" observedRunningTime="2026-03-21 05:30:34.714698206 +0000 UTC m=+2339.797281834" watchObservedRunningTime="2026-03-21 05:30:34.723499324 +0000 UTC m=+2339.806082952" Mar 21 05:32:00 crc kubenswrapper[4580]: I0321 05:32:00.138041 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gc7zd"] Mar 21 05:32:00 crc kubenswrapper[4580]: I0321 05:32:00.140093 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-gc7zd" Mar 21 05:32:00 crc kubenswrapper[4580]: I0321 05:32:00.143066 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:32:00 crc kubenswrapper[4580]: I0321 05:32:00.143698 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:32:00 crc kubenswrapper[4580]: I0321 05:32:00.145917 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:32:00 crc kubenswrapper[4580]: I0321 05:32:00.150511 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gc7zd"] Mar 21 05:32:00 crc kubenswrapper[4580]: I0321 05:32:00.267032 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh78s\" (UniqueName: \"kubernetes.io/projected/fa25cff0-b73c-49be-aabf-0215e64a0cbb-kube-api-access-nh78s\") pod \"auto-csr-approver-29567852-gc7zd\" (UID: \"fa25cff0-b73c-49be-aabf-0215e64a0cbb\") " pod="openshift-infra/auto-csr-approver-29567852-gc7zd" Mar 21 05:32:00 crc kubenswrapper[4580]: I0321 05:32:00.371104 4580 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh78s\" (UniqueName: \"kubernetes.io/projected/fa25cff0-b73c-49be-aabf-0215e64a0cbb-kube-api-access-nh78s\") pod \"auto-csr-approver-29567852-gc7zd\" (UID: \"fa25cff0-b73c-49be-aabf-0215e64a0cbb\") " pod="openshift-infra/auto-csr-approver-29567852-gc7zd" Mar 21 05:32:00 crc kubenswrapper[4580]: I0321 05:32:00.392288 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh78s\" (UniqueName: \"kubernetes.io/projected/fa25cff0-b73c-49be-aabf-0215e64a0cbb-kube-api-access-nh78s\") pod \"auto-csr-approver-29567852-gc7zd\" (UID: \"fa25cff0-b73c-49be-aabf-0215e64a0cbb\") " pod="openshift-infra/auto-csr-approver-29567852-gc7zd" Mar 21 05:32:00 crc kubenswrapper[4580]: I0321 05:32:00.458808 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-gc7zd" Mar 21 05:32:00 crc kubenswrapper[4580]: I0321 05:32:00.939036 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gc7zd"] Mar 21 05:32:01 crc kubenswrapper[4580]: I0321 05:32:01.496316 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567852-gc7zd" event={"ID":"fa25cff0-b73c-49be-aabf-0215e64a0cbb","Type":"ContainerStarted","Data":"4cf90447404b54f36c2b21c17f70d2a61ea5ffa46b1f09eb9e43f51247bba2b5"} Mar 21 05:32:02 crc kubenswrapper[4580]: I0321 05:32:02.510051 4580 generic.go:334] "Generic (PLEG): container finished" podID="fa25cff0-b73c-49be-aabf-0215e64a0cbb" containerID="258d92896bb4f8e759af6cd8e8ba30c66812606ca78ec41821efcde13e262dc2" exitCode=0 Mar 21 05:32:02 crc kubenswrapper[4580]: I0321 05:32:02.510558 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567852-gc7zd" 
event={"ID":"fa25cff0-b73c-49be-aabf-0215e64a0cbb","Type":"ContainerDied","Data":"258d92896bb4f8e759af6cd8e8ba30c66812606ca78ec41821efcde13e262dc2"} Mar 21 05:32:03 crc kubenswrapper[4580]: I0321 05:32:03.810701 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-gc7zd" Mar 21 05:32:03 crc kubenswrapper[4580]: I0321 05:32:03.958756 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh78s\" (UniqueName: \"kubernetes.io/projected/fa25cff0-b73c-49be-aabf-0215e64a0cbb-kube-api-access-nh78s\") pod \"fa25cff0-b73c-49be-aabf-0215e64a0cbb\" (UID: \"fa25cff0-b73c-49be-aabf-0215e64a0cbb\") " Mar 21 05:32:03 crc kubenswrapper[4580]: I0321 05:32:03.965082 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa25cff0-b73c-49be-aabf-0215e64a0cbb-kube-api-access-nh78s" (OuterVolumeSpecName: "kube-api-access-nh78s") pod "fa25cff0-b73c-49be-aabf-0215e64a0cbb" (UID: "fa25cff0-b73c-49be-aabf-0215e64a0cbb"). InnerVolumeSpecName "kube-api-access-nh78s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:32:04 crc kubenswrapper[4580]: I0321 05:32:04.061201 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh78s\" (UniqueName: \"kubernetes.io/projected/fa25cff0-b73c-49be-aabf-0215e64a0cbb-kube-api-access-nh78s\") on node \"crc\" DevicePath \"\"" Mar 21 05:32:04 crc kubenswrapper[4580]: I0321 05:32:04.529468 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567852-gc7zd" event={"ID":"fa25cff0-b73c-49be-aabf-0215e64a0cbb","Type":"ContainerDied","Data":"4cf90447404b54f36c2b21c17f70d2a61ea5ffa46b1f09eb9e43f51247bba2b5"} Mar 21 05:32:04 crc kubenswrapper[4580]: I0321 05:32:04.529799 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cf90447404b54f36c2b21c17f70d2a61ea5ffa46b1f09eb9e43f51247bba2b5" Mar 21 05:32:04 crc kubenswrapper[4580]: I0321 05:32:04.529499 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-gc7zd" Mar 21 05:32:04 crc kubenswrapper[4580]: I0321 05:32:04.881811 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-rszs5"] Mar 21 05:32:04 crc kubenswrapper[4580]: I0321 05:32:04.892398 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-rszs5"] Mar 21 05:32:05 crc kubenswrapper[4580]: I0321 05:32:05.632089 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="297cda95-71d8-4ad1-a1b2-a83494cb6cb6" path="/var/lib/kubelet/pods/297cda95-71d8-4ad1-a1b2-a83494cb6cb6/volumes" Mar 21 05:32:14 crc kubenswrapper[4580]: I0321 05:32:14.266439 4580 scope.go:117] "RemoveContainer" containerID="ad8d35f03fb40259df66cd1a7110886a3335661d8a6666636390f3283059d5eb" Mar 21 05:32:45 crc kubenswrapper[4580]: I0321 05:32:45.947805 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:32:45 crc kubenswrapper[4580]: I0321 05:32:45.948320 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:33:15 crc kubenswrapper[4580]: I0321 05:33:15.947495 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:33:15 crc kubenswrapper[4580]: I0321 05:33:15.947996 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:33:45 crc kubenswrapper[4580]: I0321 05:33:45.948082 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:33:45 crc kubenswrapper[4580]: I0321 05:33:45.949499 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:33:45 crc kubenswrapper[4580]: I0321 05:33:45.949613 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 05:33:45 crc kubenswrapper[4580]: I0321 05:33:45.950410 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:33:45 crc kubenswrapper[4580]: I0321 05:33:45.950567 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" gracePeriod=600 Mar 21 05:33:46 crc kubenswrapper[4580]: E0321 05:33:46.074993 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:33:46 crc kubenswrapper[4580]: I0321 05:33:46.407444 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" exitCode=0 Mar 21 05:33:46 crc kubenswrapper[4580]: I0321 05:33:46.407493 4580 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7"} Mar 21 05:33:46 crc kubenswrapper[4580]: I0321 05:33:46.407543 4580 scope.go:117] "RemoveContainer" containerID="bcd83d2a37c6af8524563e27b746db2c348fe00e999b54fcbbce11e47079c45b" Mar 21 05:33:46 crc kubenswrapper[4580]: I0321 05:33:46.408518 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:33:46 crc kubenswrapper[4580]: E0321 05:33:46.409194 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:33:58 crc kubenswrapper[4580]: I0321 05:33:58.618481 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:33:58 crc kubenswrapper[4580]: E0321 05:33:58.619250 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:34:00 crc kubenswrapper[4580]: I0321 05:34:00.144687 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567854-2v6sq"] Mar 21 05:34:00 crc kubenswrapper[4580]: E0321 05:34:00.145329 
4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa25cff0-b73c-49be-aabf-0215e64a0cbb" containerName="oc" Mar 21 05:34:00 crc kubenswrapper[4580]: I0321 05:34:00.145354 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa25cff0-b73c-49be-aabf-0215e64a0cbb" containerName="oc" Mar 21 05:34:00 crc kubenswrapper[4580]: I0321 05:34:00.145907 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa25cff0-b73c-49be-aabf-0215e64a0cbb" containerName="oc" Mar 21 05:34:00 crc kubenswrapper[4580]: I0321 05:34:00.146905 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-2v6sq" Mar 21 05:34:00 crc kubenswrapper[4580]: I0321 05:34:00.149663 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:34:00 crc kubenswrapper[4580]: I0321 05:34:00.151698 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:34:00 crc kubenswrapper[4580]: I0321 05:34:00.152215 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:34:00 crc kubenswrapper[4580]: I0321 05:34:00.161316 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-2v6sq"] Mar 21 05:34:00 crc kubenswrapper[4580]: I0321 05:34:00.269714 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgrd7\" (UniqueName: \"kubernetes.io/projected/77fa1559-ac28-4b0d-a925-97d89f945299-kube-api-access-fgrd7\") pod \"auto-csr-approver-29567854-2v6sq\" (UID: \"77fa1559-ac28-4b0d-a925-97d89f945299\") " pod="openshift-infra/auto-csr-approver-29567854-2v6sq" Mar 21 05:34:00 crc kubenswrapper[4580]: I0321 05:34:00.371766 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fgrd7\" (UniqueName: \"kubernetes.io/projected/77fa1559-ac28-4b0d-a925-97d89f945299-kube-api-access-fgrd7\") pod \"auto-csr-approver-29567854-2v6sq\" (UID: \"77fa1559-ac28-4b0d-a925-97d89f945299\") " pod="openshift-infra/auto-csr-approver-29567854-2v6sq" Mar 21 05:34:00 crc kubenswrapper[4580]: I0321 05:34:00.393220 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgrd7\" (UniqueName: \"kubernetes.io/projected/77fa1559-ac28-4b0d-a925-97d89f945299-kube-api-access-fgrd7\") pod \"auto-csr-approver-29567854-2v6sq\" (UID: \"77fa1559-ac28-4b0d-a925-97d89f945299\") " pod="openshift-infra/auto-csr-approver-29567854-2v6sq" Mar 21 05:34:00 crc kubenswrapper[4580]: I0321 05:34:00.473609 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-2v6sq" Mar 21 05:34:00 crc kubenswrapper[4580]: I0321 05:34:00.916946 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-2v6sq"] Mar 21 05:34:00 crc kubenswrapper[4580]: W0321 05:34:00.933512 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77fa1559_ac28_4b0d_a925_97d89f945299.slice/crio-5976b0e731df2357e1ffd648845e9f214b4f806556354f14afc815601d51a756 WatchSource:0}: Error finding container 5976b0e731df2357e1ffd648845e9f214b4f806556354f14afc815601d51a756: Status 404 returned error can't find the container with id 5976b0e731df2357e1ffd648845e9f214b4f806556354f14afc815601d51a756 Mar 21 05:34:01 crc kubenswrapper[4580]: I0321 05:34:01.535509 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-2v6sq" event={"ID":"77fa1559-ac28-4b0d-a925-97d89f945299","Type":"ContainerStarted","Data":"5976b0e731df2357e1ffd648845e9f214b4f806556354f14afc815601d51a756"} Mar 21 05:34:02 crc kubenswrapper[4580]: I0321 05:34:02.546207 4580 generic.go:334] 
"Generic (PLEG): container finished" podID="77fa1559-ac28-4b0d-a925-97d89f945299" containerID="d35ced91df9bd8ebef28543dc4af9533487bbd288ae2357f3209ab10d4803691" exitCode=0 Mar 21 05:34:02 crc kubenswrapper[4580]: I0321 05:34:02.546255 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-2v6sq" event={"ID":"77fa1559-ac28-4b0d-a925-97d89f945299","Type":"ContainerDied","Data":"d35ced91df9bd8ebef28543dc4af9533487bbd288ae2357f3209ab10d4803691"} Mar 21 05:34:03 crc kubenswrapper[4580]: I0321 05:34:03.896823 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-2v6sq" Mar 21 05:34:03 crc kubenswrapper[4580]: I0321 05:34:03.948112 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgrd7\" (UniqueName: \"kubernetes.io/projected/77fa1559-ac28-4b0d-a925-97d89f945299-kube-api-access-fgrd7\") pod \"77fa1559-ac28-4b0d-a925-97d89f945299\" (UID: \"77fa1559-ac28-4b0d-a925-97d89f945299\") " Mar 21 05:34:03 crc kubenswrapper[4580]: I0321 05:34:03.955045 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77fa1559-ac28-4b0d-a925-97d89f945299-kube-api-access-fgrd7" (OuterVolumeSpecName: "kube-api-access-fgrd7") pod "77fa1559-ac28-4b0d-a925-97d89f945299" (UID: "77fa1559-ac28-4b0d-a925-97d89f945299"). InnerVolumeSpecName "kube-api-access-fgrd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:34:04 crc kubenswrapper[4580]: I0321 05:34:04.050158 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgrd7\" (UniqueName: \"kubernetes.io/projected/77fa1559-ac28-4b0d-a925-97d89f945299-kube-api-access-fgrd7\") on node \"crc\" DevicePath \"\"" Mar 21 05:34:04 crc kubenswrapper[4580]: I0321 05:34:04.563092 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-2v6sq" event={"ID":"77fa1559-ac28-4b0d-a925-97d89f945299","Type":"ContainerDied","Data":"5976b0e731df2357e1ffd648845e9f214b4f806556354f14afc815601d51a756"} Mar 21 05:34:04 crc kubenswrapper[4580]: I0321 05:34:04.563145 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5976b0e731df2357e1ffd648845e9f214b4f806556354f14afc815601d51a756" Mar 21 05:34:04 crc kubenswrapper[4580]: I0321 05:34:04.563150 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-2v6sq" Mar 21 05:34:04 crc kubenswrapper[4580]: I0321 05:34:04.967501 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-zmvbq"] Mar 21 05:34:04 crc kubenswrapper[4580]: I0321 05:34:04.976288 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-zmvbq"] Mar 21 05:34:05 crc kubenswrapper[4580]: I0321 05:34:05.628551 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65db5d38-ca46-4fd6-999e-508bbb7e49b4" path="/var/lib/kubelet/pods/65db5d38-ca46-4fd6-999e-508bbb7e49b4/volumes" Mar 21 05:34:09 crc kubenswrapper[4580]: I0321 05:34:09.617866 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:34:09 crc kubenswrapper[4580]: E0321 05:34:09.618654 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:34:14 crc kubenswrapper[4580]: I0321 05:34:14.373733 4580 scope.go:117] "RemoveContainer" containerID="cb7b6fdf73ee5a020c64564651deb11e43abf4e21e4513f25c30c1ef33928fc0" Mar 21 05:34:23 crc kubenswrapper[4580]: I0321 05:34:23.617908 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:34:23 crc kubenswrapper[4580]: E0321 05:34:23.618656 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:34:29 crc kubenswrapper[4580]: I0321 05:34:29.779615 4580 generic.go:334] "Generic (PLEG): container finished" podID="e355e210-9abe-4bdf-bcbf-70e95e437482" containerID="a8197a60fb63f50c696bef932374d0d35c1355874672400885bdb7bfbfa74afe" exitCode=0 Mar 21 05:34:29 crc kubenswrapper[4580]: I0321 05:34:29.780136 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" event={"ID":"e355e210-9abe-4bdf-bcbf-70e95e437482","Type":"ContainerDied","Data":"a8197a60fb63f50c696bef932374d0d35c1355874672400885bdb7bfbfa74afe"} Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.164337 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.298421 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-libvirt-combined-ca-bundle\") pod \"e355e210-9abe-4bdf-bcbf-70e95e437482\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.298476 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhkf2\" (UniqueName: \"kubernetes.io/projected/e355e210-9abe-4bdf-bcbf-70e95e437482-kube-api-access-rhkf2\") pod \"e355e210-9abe-4bdf-bcbf-70e95e437482\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.298531 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-libvirt-secret-0\") pod \"e355e210-9abe-4bdf-bcbf-70e95e437482\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.298664 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-inventory\") pod \"e355e210-9abe-4bdf-bcbf-70e95e437482\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.298711 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-ssh-key-openstack-edpm-ipam\") pod \"e355e210-9abe-4bdf-bcbf-70e95e437482\" (UID: \"e355e210-9abe-4bdf-bcbf-70e95e437482\") " Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.310922 4580 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e355e210-9abe-4bdf-bcbf-70e95e437482" (UID: "e355e210-9abe-4bdf-bcbf-70e95e437482"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.311011 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e355e210-9abe-4bdf-bcbf-70e95e437482-kube-api-access-rhkf2" (OuterVolumeSpecName: "kube-api-access-rhkf2") pod "e355e210-9abe-4bdf-bcbf-70e95e437482" (UID: "e355e210-9abe-4bdf-bcbf-70e95e437482"). InnerVolumeSpecName "kube-api-access-rhkf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.329428 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e355e210-9abe-4bdf-bcbf-70e95e437482" (UID: "e355e210-9abe-4bdf-bcbf-70e95e437482"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.329553 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e355e210-9abe-4bdf-bcbf-70e95e437482" (UID: "e355e210-9abe-4bdf-bcbf-70e95e437482"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.333982 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-inventory" (OuterVolumeSpecName: "inventory") pod "e355e210-9abe-4bdf-bcbf-70e95e437482" (UID: "e355e210-9abe-4bdf-bcbf-70e95e437482"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.401723 4580 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.401946 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.402030 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.402122 4580 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e355e210-9abe-4bdf-bcbf-70e95e437482-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.402497 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhkf2\" (UniqueName: \"kubernetes.io/projected/e355e210-9abe-4bdf-bcbf-70e95e437482-kube-api-access-rhkf2\") on node \"crc\" DevicePath \"\"" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.795971 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" event={"ID":"e355e210-9abe-4bdf-bcbf-70e95e437482","Type":"ContainerDied","Data":"00faafec901b10279dbeb50cdf2046f53574128c31e66fb936b39d293cb0f773"} Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.796010 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00faafec901b10279dbeb50cdf2046f53574128c31e66fb936b39d293cb0f773" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.796073 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.908409 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h"] Mar 21 05:34:31 crc kubenswrapper[4580]: E0321 05:34:31.909103 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e355e210-9abe-4bdf-bcbf-70e95e437482" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.909122 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e355e210-9abe-4bdf-bcbf-70e95e437482" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 21 05:34:31 crc kubenswrapper[4580]: E0321 05:34:31.909137 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77fa1559-ac28-4b0d-a925-97d89f945299" containerName="oc" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.909143 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="77fa1559-ac28-4b0d-a925-97d89f945299" containerName="oc" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.909317 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="77fa1559-ac28-4b0d-a925-97d89f945299" containerName="oc" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.909329 4580 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e355e210-9abe-4bdf-bcbf-70e95e437482" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.909923 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.913039 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.913316 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.913626 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.917941 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.918396 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.918597 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.919405 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:34:31 crc kubenswrapper[4580]: I0321 05:34:31.923937 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h"] Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.015864 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.015909 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.015938 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.015960 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.016042 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: 
\"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.016081 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.016119 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.016150 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.016176 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 
05:34:32.016201 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llg2x\" (UniqueName: \"kubernetes.io/projected/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-kube-api-access-llg2x\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.016239 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.118039 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.118099 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.118127 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.118156 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.118204 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.118239 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.118262 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.118288 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.118307 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.118337 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llg2x\" (UniqueName: \"kubernetes.io/projected/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-kube-api-access-llg2x\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.118362 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.119845 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" 
(UniqueName: \"kubernetes.io/configmap/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.123935 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.124578 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.125189 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.127046 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.127355 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.127908 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.128371 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.130981 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.136547 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.140953 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llg2x\" (UniqueName: \"kubernetes.io/projected/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-kube-api-access-llg2x\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gxx9h\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.265598 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.826970 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h"] Mar 21 05:34:32 crc kubenswrapper[4580]: I0321 05:34:32.831432 4580 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:34:33 crc kubenswrapper[4580]: I0321 05:34:33.814900 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" event={"ID":"bf805790-d6ce-495d-8d85-dd7cf68b4bf3","Type":"ContainerStarted","Data":"ae3fa367c4c5360fe9d61c13d51140bdb1052cbd9c742b237c22c776faad572f"} Mar 21 05:34:35 crc kubenswrapper[4580]: I0321 05:34:35.629011 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:34:35 crc kubenswrapper[4580]: E0321 05:34:35.629733 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:34:37 crc kubenswrapper[4580]: I0321 05:34:37.358276 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:34:37 crc kubenswrapper[4580]: I0321 05:34:37.847002 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" event={"ID":"bf805790-d6ce-495d-8d85-dd7cf68b4bf3","Type":"ContainerStarted","Data":"51a88ab4f44c7c529b97c815ea02c30068104bc40f4fdb197bfb59bbb8c5fc6a"} Mar 21 05:34:37 crc kubenswrapper[4580]: I0321 05:34:37.874500 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" podStartSLOduration=2.350349015 podStartE2EDuration="6.874479707s" podCreationTimestamp="2026-03-21 05:34:31 +0000 UTC" firstStartedPulling="2026-03-21 05:34:32.831215771 +0000 UTC m=+2577.913799399" lastFinishedPulling="2026-03-21 05:34:37.355346463 +0000 UTC m=+2582.437930091" observedRunningTime="2026-03-21 05:34:37.870123749 +0000 UTC m=+2582.952707377" watchObservedRunningTime="2026-03-21 05:34:37.874479707 +0000 UTC m=+2582.957063335" Mar 21 05:34:48 crc kubenswrapper[4580]: I0321 05:34:48.618367 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:34:48 crc kubenswrapper[4580]: E0321 05:34:48.619155 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:35:00 crc kubenswrapper[4580]: I0321 05:35:00.618244 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:35:00 crc kubenswrapper[4580]: E0321 05:35:00.619090 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:35:11 crc kubenswrapper[4580]: I0321 05:35:11.619288 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:35:11 crc kubenswrapper[4580]: E0321 05:35:11.620126 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:35:22 crc kubenswrapper[4580]: I0321 05:35:22.617821 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:35:22 crc kubenswrapper[4580]: E0321 05:35:22.618527 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:35:37 crc kubenswrapper[4580]: I0321 05:35:37.617619 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:35:37 crc kubenswrapper[4580]: E0321 05:35:37.618581 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:35:51 crc kubenswrapper[4580]: I0321 05:35:51.617380 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:35:51 crc kubenswrapper[4580]: E0321 05:35:51.618170 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:36:00 crc kubenswrapper[4580]: I0321 05:36:00.150532 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567856-6ckk9"] Mar 21 05:36:00 crc kubenswrapper[4580]: I0321 05:36:00.152213 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-6ckk9" Mar 21 05:36:00 crc kubenswrapper[4580]: I0321 05:36:00.155191 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:36:00 crc kubenswrapper[4580]: I0321 05:36:00.155424 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:36:00 crc kubenswrapper[4580]: I0321 05:36:00.155771 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:36:00 crc kubenswrapper[4580]: I0321 05:36:00.166398 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-6ckk9"] Mar 21 05:36:00 crc kubenswrapper[4580]: I0321 05:36:00.298304 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8knmj\" (UniqueName: \"kubernetes.io/projected/e851aeb4-208d-4952-8135-b3284af9d30f-kube-api-access-8knmj\") pod \"auto-csr-approver-29567856-6ckk9\" (UID: \"e851aeb4-208d-4952-8135-b3284af9d30f\") " pod="openshift-infra/auto-csr-approver-29567856-6ckk9" Mar 21 05:36:00 crc kubenswrapper[4580]: I0321 05:36:00.401125 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8knmj\" (UniqueName: \"kubernetes.io/projected/e851aeb4-208d-4952-8135-b3284af9d30f-kube-api-access-8knmj\") pod \"auto-csr-approver-29567856-6ckk9\" (UID: \"e851aeb4-208d-4952-8135-b3284af9d30f\") " pod="openshift-infra/auto-csr-approver-29567856-6ckk9" Mar 21 05:36:00 crc kubenswrapper[4580]: I0321 05:36:00.425120 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8knmj\" (UniqueName: \"kubernetes.io/projected/e851aeb4-208d-4952-8135-b3284af9d30f-kube-api-access-8knmj\") pod \"auto-csr-approver-29567856-6ckk9\" (UID: \"e851aeb4-208d-4952-8135-b3284af9d30f\") " 
pod="openshift-infra/auto-csr-approver-29567856-6ckk9" Mar 21 05:36:00 crc kubenswrapper[4580]: I0321 05:36:00.483576 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-6ckk9" Mar 21 05:36:00 crc kubenswrapper[4580]: I0321 05:36:00.918827 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-6ckk9"] Mar 21 05:36:01 crc kubenswrapper[4580]: I0321 05:36:01.592309 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567856-6ckk9" event={"ID":"e851aeb4-208d-4952-8135-b3284af9d30f","Type":"ContainerStarted","Data":"7bd29df4982730e78fe394e7dda84a2ce2d5e8ac39b53c6e729e7d9e682bb237"} Mar 21 05:36:02 crc kubenswrapper[4580]: I0321 05:36:02.602805 4580 generic.go:334] "Generic (PLEG): container finished" podID="e851aeb4-208d-4952-8135-b3284af9d30f" containerID="785f4c50746739aaf71b2500d7173d011116f9f7d3c832e82f59cd838b1fd70a" exitCode=0 Mar 21 05:36:02 crc kubenswrapper[4580]: I0321 05:36:02.602849 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567856-6ckk9" event={"ID":"e851aeb4-208d-4952-8135-b3284af9d30f","Type":"ContainerDied","Data":"785f4c50746739aaf71b2500d7173d011116f9f7d3c832e82f59cd838b1fd70a"} Mar 21 05:36:04 crc kubenswrapper[4580]: I0321 05:36:04.009156 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-6ckk9" Mar 21 05:36:04 crc kubenswrapper[4580]: I0321 05:36:04.186987 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8knmj\" (UniqueName: \"kubernetes.io/projected/e851aeb4-208d-4952-8135-b3284af9d30f-kube-api-access-8knmj\") pod \"e851aeb4-208d-4952-8135-b3284af9d30f\" (UID: \"e851aeb4-208d-4952-8135-b3284af9d30f\") " Mar 21 05:36:04 crc kubenswrapper[4580]: I0321 05:36:04.193255 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e851aeb4-208d-4952-8135-b3284af9d30f-kube-api-access-8knmj" (OuterVolumeSpecName: "kube-api-access-8knmj") pod "e851aeb4-208d-4952-8135-b3284af9d30f" (UID: "e851aeb4-208d-4952-8135-b3284af9d30f"). InnerVolumeSpecName "kube-api-access-8knmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:36:04 crc kubenswrapper[4580]: I0321 05:36:04.289614 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8knmj\" (UniqueName: \"kubernetes.io/projected/e851aeb4-208d-4952-8135-b3284af9d30f-kube-api-access-8knmj\") on node \"crc\" DevicePath \"\"" Mar 21 05:36:04 crc kubenswrapper[4580]: I0321 05:36:04.621038 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567856-6ckk9" event={"ID":"e851aeb4-208d-4952-8135-b3284af9d30f","Type":"ContainerDied","Data":"7bd29df4982730e78fe394e7dda84a2ce2d5e8ac39b53c6e729e7d9e682bb237"} Mar 21 05:36:04 crc kubenswrapper[4580]: I0321 05:36:04.621080 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bd29df4982730e78fe394e7dda84a2ce2d5e8ac39b53c6e729e7d9e682bb237" Mar 21 05:36:04 crc kubenswrapper[4580]: I0321 05:36:04.621140 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-6ckk9" Mar 21 05:36:05 crc kubenswrapper[4580]: I0321 05:36:05.085718 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-9dcqw"] Mar 21 05:36:05 crc kubenswrapper[4580]: I0321 05:36:05.096477 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-9dcqw"] Mar 21 05:36:05 crc kubenswrapper[4580]: I0321 05:36:05.625493 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:36:05 crc kubenswrapper[4580]: E0321 05:36:05.625875 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:36:05 crc kubenswrapper[4580]: I0321 05:36:05.629973 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0bb77cf-8c33-4f71-b2ba-8ad78fd21467" path="/var/lib/kubelet/pods/b0bb77cf-8c33-4f71-b2ba-8ad78fd21467/volumes" Mar 21 05:36:12 crc kubenswrapper[4580]: I0321 05:36:12.194102 4580 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xgkxr container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 05:36:12 crc kubenswrapper[4580]: I0321 05:36:12.194703 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" podUID="d9cce850-4a50-4a52-ac9b-147fcbde086a" containerName="olm-operator" 
probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:36:12 crc kubenswrapper[4580]: I0321 05:36:12.195955 4580 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xgkxr container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 05:36:12 crc kubenswrapper[4580]: I0321 05:36:12.195990 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xgkxr" podUID="d9cce850-4a50-4a52-ac9b-147fcbde086a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 05:36:14 crc kubenswrapper[4580]: I0321 05:36:14.463573 4580 scope.go:117] "RemoveContainer" containerID="30bed1e95a5e72b436b1e97c397d6b82dea02f5a02db029106185e3efed54d59" Mar 21 05:36:17 crc kubenswrapper[4580]: I0321 05:36:17.618631 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:36:17 crc kubenswrapper[4580]: E0321 05:36:17.620298 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:36:30 crc kubenswrapper[4580]: I0321 05:36:30.617946 4580 scope.go:117] "RemoveContainer" 
containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:36:30 crc kubenswrapper[4580]: E0321 05:36:30.618701 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.150349 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dd6px"] Mar 21 05:36:42 crc kubenswrapper[4580]: E0321 05:36:42.151423 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e851aeb4-208d-4952-8135-b3284af9d30f" containerName="oc" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.151439 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e851aeb4-208d-4952-8135-b3284af9d30f" containerName="oc" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.151695 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="e851aeb4-208d-4952-8135-b3284af9d30f" containerName="oc" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.153730 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.168616 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dd6px"] Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.196221 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-catalog-content\") pod \"redhat-operators-dd6px\" (UID: \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\") " pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.196406 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-utilities\") pod \"redhat-operators-dd6px\" (UID: \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\") " pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.196423 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tnww\" (UniqueName: \"kubernetes.io/projected/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-kube-api-access-8tnww\") pod \"redhat-operators-dd6px\" (UID: \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\") " pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.297901 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tnww\" (UniqueName: \"kubernetes.io/projected/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-kube-api-access-8tnww\") pod \"redhat-operators-dd6px\" (UID: \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\") " pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.298435 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-utilities\") pod \"redhat-operators-dd6px\" (UID: \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\") " pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.298548 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-catalog-content\") pod \"redhat-operators-dd6px\" (UID: \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\") " pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.299156 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-catalog-content\") pod \"redhat-operators-dd6px\" (UID: \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\") " pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.299225 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-utilities\") pod \"redhat-operators-dd6px\" (UID: \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\") " pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.328591 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tnww\" (UniqueName: \"kubernetes.io/projected/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-kube-api-access-8tnww\") pod \"redhat-operators-dd6px\" (UID: \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\") " pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.491183 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:36:42 crc kubenswrapper[4580]: I0321 05:36:42.987597 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dd6px"] Mar 21 05:36:43 crc kubenswrapper[4580]: I0321 05:36:43.988968 4580 generic.go:334] "Generic (PLEG): container finished" podID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerID="e5cf0009571c9c4845b09aaa930d7525b316193ba276544407793354499a13bb" exitCode=0 Mar 21 05:36:43 crc kubenswrapper[4580]: I0321 05:36:43.989205 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd6px" event={"ID":"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff","Type":"ContainerDied","Data":"e5cf0009571c9c4845b09aaa930d7525b316193ba276544407793354499a13bb"} Mar 21 05:36:43 crc kubenswrapper[4580]: I0321 05:36:43.989228 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd6px" event={"ID":"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff","Type":"ContainerStarted","Data":"4917fc26345a0fad328502dfb4caa3989079fac33fe72d6bf07504503f013e84"} Mar 21 05:36:44 crc kubenswrapper[4580]: I0321 05:36:44.618673 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:36:44 crc kubenswrapper[4580]: E0321 05:36:44.619397 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:36:46 crc kubenswrapper[4580]: I0321 05:36:46.006519 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd6px" 
event={"ID":"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff","Type":"ContainerStarted","Data":"49b0f0866d59ae8d2eec8a7458475e1b1cf74bd0e558947667858d02b77c51d8"} Mar 21 05:36:55 crc kubenswrapper[4580]: I0321 05:36:55.075229 4580 generic.go:334] "Generic (PLEG): container finished" podID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerID="49b0f0866d59ae8d2eec8a7458475e1b1cf74bd0e558947667858d02b77c51d8" exitCode=0 Mar 21 05:36:55 crc kubenswrapper[4580]: I0321 05:36:55.075314 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd6px" event={"ID":"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff","Type":"ContainerDied","Data":"49b0f0866d59ae8d2eec8a7458475e1b1cf74bd0e558947667858d02b77c51d8"} Mar 21 05:36:56 crc kubenswrapper[4580]: I0321 05:36:56.091273 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd6px" event={"ID":"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff","Type":"ContainerStarted","Data":"32d1c6c0b5d5ee725ac96df1459c2bc2b23ec6c2872dce43e05585ee587b30ba"} Mar 21 05:36:56 crc kubenswrapper[4580]: I0321 05:36:56.116769 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dd6px" podStartSLOduration=2.597292054 podStartE2EDuration="14.116751845s" podCreationTimestamp="2026-03-21 05:36:42 +0000 UTC" firstStartedPulling="2026-03-21 05:36:43.991012514 +0000 UTC m=+2709.073596142" lastFinishedPulling="2026-03-21 05:36:55.510472305 +0000 UTC m=+2720.593055933" observedRunningTime="2026-03-21 05:36:56.108342837 +0000 UTC m=+2721.190926485" watchObservedRunningTime="2026-03-21 05:36:56.116751845 +0000 UTC m=+2721.199335473" Mar 21 05:36:58 crc kubenswrapper[4580]: I0321 05:36:58.618650 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:36:58 crc kubenswrapper[4580]: E0321 05:36:58.619323 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:37:02 crc kubenswrapper[4580]: I0321 05:37:02.491449 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:37:02 crc kubenswrapper[4580]: I0321 05:37:02.492113 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:37:03 crc kubenswrapper[4580]: I0321 05:37:03.532063 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dd6px" podUID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerName="registry-server" probeResult="failure" output=< Mar 21 05:37:03 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:37:03 crc kubenswrapper[4580]: > Mar 21 05:37:11 crc kubenswrapper[4580]: I0321 05:37:11.617460 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:37:11 crc kubenswrapper[4580]: E0321 05:37:11.618333 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:37:13 crc kubenswrapper[4580]: I0321 05:37:13.536549 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dd6px" 
podUID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerName="registry-server" probeResult="failure" output=< Mar 21 05:37:13 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:37:13 crc kubenswrapper[4580]: > Mar 21 05:37:20 crc kubenswrapper[4580]: I0321 05:37:20.318097 4580 generic.go:334] "Generic (PLEG): container finished" podID="bf805790-d6ce-495d-8d85-dd7cf68b4bf3" containerID="51a88ab4f44c7c529b97c815ea02c30068104bc40f4fdb197bfb59bbb8c5fc6a" exitCode=0 Mar 21 05:37:20 crc kubenswrapper[4580]: I0321 05:37:20.318204 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" event={"ID":"bf805790-d6ce-495d-8d85-dd7cf68b4bf3","Type":"ContainerDied","Data":"51a88ab4f44c7c529b97c815ea02c30068104bc40f4fdb197bfb59bbb8c5fc6a"} Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.769668 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.920102 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-migration-ssh-key-0\") pod \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.920157 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-inventory\") pod \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.920182 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-0\") pod \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.920246 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llg2x\" (UniqueName: \"kubernetes.io/projected/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-kube-api-access-llg2x\") pod \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.920362 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-combined-ca-bundle\") pod \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.920417 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-ssh-key-openstack-edpm-ipam\") pod \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.920443 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-3\") pod \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.920494 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-1\") pod 
\"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.920531 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-2\") pod \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.920557 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-migration-ssh-key-1\") pod \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.920607 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-extra-config-0\") pod \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\" (UID: \"bf805790-d6ce-495d-8d85-dd7cf68b4bf3\") " Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.939873 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "bf805790-d6ce-495d-8d85-dd7cf68b4bf3" (UID: "bf805790-d6ce-495d-8d85-dd7cf68b4bf3"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.951601 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-kube-api-access-llg2x" (OuterVolumeSpecName: "kube-api-access-llg2x") pod "bf805790-d6ce-495d-8d85-dd7cf68b4bf3" (UID: "bf805790-d6ce-495d-8d85-dd7cf68b4bf3"). InnerVolumeSpecName "kube-api-access-llg2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.969985 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bf805790-d6ce-495d-8d85-dd7cf68b4bf3" (UID: "bf805790-d6ce-495d-8d85-dd7cf68b4bf3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.973506 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-inventory" (OuterVolumeSpecName: "inventory") pod "bf805790-d6ce-495d-8d85-dd7cf68b4bf3" (UID: "bf805790-d6ce-495d-8d85-dd7cf68b4bf3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.992794 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "bf805790-d6ce-495d-8d85-dd7cf68b4bf3" (UID: "bf805790-d6ce-495d-8d85-dd7cf68b4bf3"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:37:21 crc kubenswrapper[4580]: I0321 05:37:21.995299 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "bf805790-d6ce-495d-8d85-dd7cf68b4bf3" (UID: "bf805790-d6ce-495d-8d85-dd7cf68b4bf3"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.002760 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "bf805790-d6ce-495d-8d85-dd7cf68b4bf3" (UID: "bf805790-d6ce-495d-8d85-dd7cf68b4bf3"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.012168 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "bf805790-d6ce-495d-8d85-dd7cf68b4bf3" (UID: "bf805790-d6ce-495d-8d85-dd7cf68b4bf3"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.022129 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "bf805790-d6ce-495d-8d85-dd7cf68b4bf3" (UID: "bf805790-d6ce-495d-8d85-dd7cf68b4bf3"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.023832 4580 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.023866 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.023879 4580 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.023891 4580 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.023901 4580 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.023915 4580 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.023927 4580 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.023939 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.023950 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llg2x\" (UniqueName: \"kubernetes.io/projected/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-kube-api-access-llg2x\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.026985 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "bf805790-d6ce-495d-8d85-dd7cf68b4bf3" (UID: "bf805790-d6ce-495d-8d85-dd7cf68b4bf3"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.028100 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "bf805790-d6ce-495d-8d85-dd7cf68b4bf3" (UID: "bf805790-d6ce-495d-8d85-dd7cf68b4bf3"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.127033 4580 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.127102 4580 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bf805790-d6ce-495d-8d85-dd7cf68b4bf3-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.335648 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" event={"ID":"bf805790-d6ce-495d-8d85-dd7cf68b4bf3","Type":"ContainerDied","Data":"ae3fa367c4c5360fe9d61c13d51140bdb1052cbd9c742b237c22c776faad572f"} Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.335686 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gxx9h" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.335687 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae3fa367c4c5360fe9d61c13d51140bdb1052cbd9c742b237c22c776faad572f" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.478292 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666"] Mar 21 05:37:22 crc kubenswrapper[4580]: E0321 05:37:22.478690 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf805790-d6ce-495d-8d85-dd7cf68b4bf3" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.478706 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf805790-d6ce-495d-8d85-dd7cf68b4bf3" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.478953 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf805790-d6ce-495d-8d85-dd7cf68b4bf3" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.479565 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.482279 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.483515 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.484144 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8ljw5" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.485477 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.485487 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.490138 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666"] Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.637448 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.637511 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-telemetry-combined-ca-bundle\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.637540 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.637558 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4fgk\" (UniqueName: \"kubernetes.io/projected/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-kube-api-access-c4fgk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.637610 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.637671 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.638013 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.739641 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.739709 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.739808 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.739884 4580 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.739922 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.739968 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.739985 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4fgk\" (UniqueName: \"kubernetes.io/projected/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-kube-api-access-c4fgk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.745536 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.745623 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.745692 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.746212 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.754399 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: 
\"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.757457 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.760486 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4fgk\" (UniqueName: \"kubernetes.io/projected/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-kube-api-access-c4fgk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bc666\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:22 crc kubenswrapper[4580]: I0321 05:37:22.797297 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:37:23 crc kubenswrapper[4580]: I0321 05:37:23.385669 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666"] Mar 21 05:37:23 crc kubenswrapper[4580]: I0321 05:37:23.537301 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dd6px" podUID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerName="registry-server" probeResult="failure" output=< Mar 21 05:37:23 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:37:23 crc kubenswrapper[4580]: > Mar 21 05:37:24 crc kubenswrapper[4580]: I0321 05:37:24.353106 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" event={"ID":"53375dd9-0a2b-413f-8fa2-1ebd8d63df42","Type":"ContainerStarted","Data":"86278405e1a31d52776aa2b7591dcc4b472c70052f233fffddca85f3005b4194"} Mar 21 05:37:24 crc kubenswrapper[4580]: I0321 05:37:24.353435 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" event={"ID":"53375dd9-0a2b-413f-8fa2-1ebd8d63df42","Type":"ContainerStarted","Data":"93aee533e3e658cbe92b2a2bb8ce147d40f467e9dfef84be11e7d1fc58b908e7"} Mar 21 05:37:24 crc kubenswrapper[4580]: I0321 05:37:24.376334 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" podStartSLOduration=1.752500138 podStartE2EDuration="2.376315781s" podCreationTimestamp="2026-03-21 05:37:22 +0000 UTC" firstStartedPulling="2026-03-21 05:37:23.388940906 +0000 UTC m=+2748.471524534" lastFinishedPulling="2026-03-21 05:37:24.012756549 +0000 UTC m=+2749.095340177" observedRunningTime="2026-03-21 05:37:24.37370744 +0000 UTC m=+2749.456291078" watchObservedRunningTime="2026-03-21 05:37:24.376315781 +0000 
UTC m=+2749.458899409" Mar 21 05:37:26 crc kubenswrapper[4580]: I0321 05:37:26.618550 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:37:26 crc kubenswrapper[4580]: E0321 05:37:26.619678 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:37:32 crc kubenswrapper[4580]: I0321 05:37:32.547940 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:37:32 crc kubenswrapper[4580]: I0321 05:37:32.610260 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:37:32 crc kubenswrapper[4580]: I0321 05:37:32.791710 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dd6px"] Mar 21 05:37:34 crc kubenswrapper[4580]: I0321 05:37:34.448229 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dd6px" podUID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerName="registry-server" containerID="cri-o://32d1c6c0b5d5ee725ac96df1459c2bc2b23ec6c2872dce43e05585ee587b30ba" gracePeriod=2 Mar 21 05:37:34 crc kubenswrapper[4580]: I0321 05:37:34.969522 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:34.997539 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tnww\" (UniqueName: \"kubernetes.io/projected/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-kube-api-access-8tnww\") pod \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\" (UID: \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\") " Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:34.997752 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-utilities\") pod \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\" (UID: \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\") " Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:34.997982 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-catalog-content\") pod \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\" (UID: \"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff\") " Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.001930 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-utilities" (OuterVolumeSpecName: "utilities") pod "efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" (UID: "efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.025034 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-kube-api-access-8tnww" (OuterVolumeSpecName: "kube-api-access-8tnww") pod "efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" (UID: "efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff"). InnerVolumeSpecName "kube-api-access-8tnww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.100394 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tnww\" (UniqueName: \"kubernetes.io/projected/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-kube-api-access-8tnww\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.100800 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.144302 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" (UID: "efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.202530 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.462221 4580 generic.go:334] "Generic (PLEG): container finished" podID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerID="32d1c6c0b5d5ee725ac96df1459c2bc2b23ec6c2872dce43e05585ee587b30ba" exitCode=0 Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.462281 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd6px" event={"ID":"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff","Type":"ContainerDied","Data":"32d1c6c0b5d5ee725ac96df1459c2bc2b23ec6c2872dce43e05585ee587b30ba"} Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.462415 4580 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-dd6px" event={"ID":"efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff","Type":"ContainerDied","Data":"4917fc26345a0fad328502dfb4caa3989079fac33fe72d6bf07504503f013e84"} Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.462447 4580 scope.go:117] "RemoveContainer" containerID="32d1c6c0b5d5ee725ac96df1459c2bc2b23ec6c2872dce43e05585ee587b30ba" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.462357 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dd6px" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.494953 4580 scope.go:117] "RemoveContainer" containerID="49b0f0866d59ae8d2eec8a7458475e1b1cf74bd0e558947667858d02b77c51d8" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.499105 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dd6px"] Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.508436 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dd6px"] Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.535854 4580 scope.go:117] "RemoveContainer" containerID="e5cf0009571c9c4845b09aaa930d7525b316193ba276544407793354499a13bb" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.560661 4580 scope.go:117] "RemoveContainer" containerID="32d1c6c0b5d5ee725ac96df1459c2bc2b23ec6c2872dce43e05585ee587b30ba" Mar 21 05:37:35 crc kubenswrapper[4580]: E0321 05:37:35.561052 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d1c6c0b5d5ee725ac96df1459c2bc2b23ec6c2872dce43e05585ee587b30ba\": container with ID starting with 32d1c6c0b5d5ee725ac96df1459c2bc2b23ec6c2872dce43e05585ee587b30ba not found: ID does not exist" containerID="32d1c6c0b5d5ee725ac96df1459c2bc2b23ec6c2872dce43e05585ee587b30ba" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.561083 4580 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d1c6c0b5d5ee725ac96df1459c2bc2b23ec6c2872dce43e05585ee587b30ba"} err="failed to get container status \"32d1c6c0b5d5ee725ac96df1459c2bc2b23ec6c2872dce43e05585ee587b30ba\": rpc error: code = NotFound desc = could not find container \"32d1c6c0b5d5ee725ac96df1459c2bc2b23ec6c2872dce43e05585ee587b30ba\": container with ID starting with 32d1c6c0b5d5ee725ac96df1459c2bc2b23ec6c2872dce43e05585ee587b30ba not found: ID does not exist" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.561103 4580 scope.go:117] "RemoveContainer" containerID="49b0f0866d59ae8d2eec8a7458475e1b1cf74bd0e558947667858d02b77c51d8" Mar 21 05:37:35 crc kubenswrapper[4580]: E0321 05:37:35.561361 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49b0f0866d59ae8d2eec8a7458475e1b1cf74bd0e558947667858d02b77c51d8\": container with ID starting with 49b0f0866d59ae8d2eec8a7458475e1b1cf74bd0e558947667858d02b77c51d8 not found: ID does not exist" containerID="49b0f0866d59ae8d2eec8a7458475e1b1cf74bd0e558947667858d02b77c51d8" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.561383 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b0f0866d59ae8d2eec8a7458475e1b1cf74bd0e558947667858d02b77c51d8"} err="failed to get container status \"49b0f0866d59ae8d2eec8a7458475e1b1cf74bd0e558947667858d02b77c51d8\": rpc error: code = NotFound desc = could not find container \"49b0f0866d59ae8d2eec8a7458475e1b1cf74bd0e558947667858d02b77c51d8\": container with ID starting with 49b0f0866d59ae8d2eec8a7458475e1b1cf74bd0e558947667858d02b77c51d8 not found: ID does not exist" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.561402 4580 scope.go:117] "RemoveContainer" containerID="e5cf0009571c9c4845b09aaa930d7525b316193ba276544407793354499a13bb" Mar 21 05:37:35 crc kubenswrapper[4580]: E0321 
05:37:35.561775 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5cf0009571c9c4845b09aaa930d7525b316193ba276544407793354499a13bb\": container with ID starting with e5cf0009571c9c4845b09aaa930d7525b316193ba276544407793354499a13bb not found: ID does not exist" containerID="e5cf0009571c9c4845b09aaa930d7525b316193ba276544407793354499a13bb" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.561814 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5cf0009571c9c4845b09aaa930d7525b316193ba276544407793354499a13bb"} err="failed to get container status \"e5cf0009571c9c4845b09aaa930d7525b316193ba276544407793354499a13bb\": rpc error: code = NotFound desc = could not find container \"e5cf0009571c9c4845b09aaa930d7525b316193ba276544407793354499a13bb\": container with ID starting with e5cf0009571c9c4845b09aaa930d7525b316193ba276544407793354499a13bb not found: ID does not exist" Mar 21 05:37:35 crc kubenswrapper[4580]: I0321 05:37:35.636485 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" path="/var/lib/kubelet/pods/efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff/volumes" Mar 21 05:37:41 crc kubenswrapper[4580]: I0321 05:37:41.618440 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:37:41 crc kubenswrapper[4580]: E0321 05:37:41.619156 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:37:56 crc kubenswrapper[4580]: I0321 05:37:56.617762 
4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:37:56 crc kubenswrapper[4580]: E0321 05:37:56.618503 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:38:00 crc kubenswrapper[4580]: I0321 05:38:00.146467 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567858-g2wsm"] Mar 21 05:38:00 crc kubenswrapper[4580]: E0321 05:38:00.147300 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerName="extract-utilities" Mar 21 05:38:00 crc kubenswrapper[4580]: I0321 05:38:00.147312 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerName="extract-utilities" Mar 21 05:38:00 crc kubenswrapper[4580]: E0321 05:38:00.147346 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerName="extract-content" Mar 21 05:38:00 crc kubenswrapper[4580]: I0321 05:38:00.147352 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerName="extract-content" Mar 21 05:38:00 crc kubenswrapper[4580]: E0321 05:38:00.147361 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerName="registry-server" Mar 21 05:38:00 crc kubenswrapper[4580]: I0321 05:38:00.147368 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerName="registry-server" Mar 21 05:38:00 crc 
kubenswrapper[4580]: I0321 05:38:00.147557 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="efbb7f7e-e7f0-48ac-a5e4-6cfb8003cbff" containerName="registry-server" Mar 21 05:38:00 crc kubenswrapper[4580]: I0321 05:38:00.148194 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-g2wsm" Mar 21 05:38:00 crc kubenswrapper[4580]: I0321 05:38:00.150736 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:38:00 crc kubenswrapper[4580]: I0321 05:38:00.151133 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:38:00 crc kubenswrapper[4580]: I0321 05:38:00.152315 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:38:00 crc kubenswrapper[4580]: I0321 05:38:00.159285 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567858-g2wsm"] Mar 21 05:38:00 crc kubenswrapper[4580]: I0321 05:38:00.198390 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn4x9\" (UniqueName: \"kubernetes.io/projected/e4bd29d6-3fa8-44dd-993a-47feb6717d75-kube-api-access-dn4x9\") pod \"auto-csr-approver-29567858-g2wsm\" (UID: \"e4bd29d6-3fa8-44dd-993a-47feb6717d75\") " pod="openshift-infra/auto-csr-approver-29567858-g2wsm" Mar 21 05:38:00 crc kubenswrapper[4580]: I0321 05:38:00.299728 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn4x9\" (UniqueName: \"kubernetes.io/projected/e4bd29d6-3fa8-44dd-993a-47feb6717d75-kube-api-access-dn4x9\") pod \"auto-csr-approver-29567858-g2wsm\" (UID: \"e4bd29d6-3fa8-44dd-993a-47feb6717d75\") " pod="openshift-infra/auto-csr-approver-29567858-g2wsm" Mar 21 05:38:00 crc kubenswrapper[4580]: I0321 05:38:00.325668 
4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn4x9\" (UniqueName: \"kubernetes.io/projected/e4bd29d6-3fa8-44dd-993a-47feb6717d75-kube-api-access-dn4x9\") pod \"auto-csr-approver-29567858-g2wsm\" (UID: \"e4bd29d6-3fa8-44dd-993a-47feb6717d75\") " pod="openshift-infra/auto-csr-approver-29567858-g2wsm" Mar 21 05:38:00 crc kubenswrapper[4580]: I0321 05:38:00.470270 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-g2wsm" Mar 21 05:38:01 crc kubenswrapper[4580]: I0321 05:38:01.602230 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567858-g2wsm"] Mar 21 05:38:01 crc kubenswrapper[4580]: I0321 05:38:01.765691 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567858-g2wsm" event={"ID":"e4bd29d6-3fa8-44dd-993a-47feb6717d75","Type":"ContainerStarted","Data":"b7a7705788a0f306e4bae42da7b3798bb817b132a3318231c78bffc98293dc52"} Mar 21 05:38:02 crc kubenswrapper[4580]: I0321 05:38:02.773793 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567858-g2wsm" event={"ID":"e4bd29d6-3fa8-44dd-993a-47feb6717d75","Type":"ContainerStarted","Data":"dc703c7158f720e2843927c66b82bae55b4756500eef5f915346390e83ebfd00"} Mar 21 05:38:03 crc kubenswrapper[4580]: I0321 05:38:03.783891 4580 generic.go:334] "Generic (PLEG): container finished" podID="e4bd29d6-3fa8-44dd-993a-47feb6717d75" containerID="dc703c7158f720e2843927c66b82bae55b4756500eef5f915346390e83ebfd00" exitCode=0 Mar 21 05:38:03 crc kubenswrapper[4580]: I0321 05:38:03.784034 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567858-g2wsm" event={"ID":"e4bd29d6-3fa8-44dd-993a-47feb6717d75","Type":"ContainerDied","Data":"dc703c7158f720e2843927c66b82bae55b4756500eef5f915346390e83ebfd00"} Mar 21 05:38:05 crc kubenswrapper[4580]: I0321 
05:38:05.146832 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-g2wsm" Mar 21 05:38:05 crc kubenswrapper[4580]: I0321 05:38:05.306355 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn4x9\" (UniqueName: \"kubernetes.io/projected/e4bd29d6-3fa8-44dd-993a-47feb6717d75-kube-api-access-dn4x9\") pod \"e4bd29d6-3fa8-44dd-993a-47feb6717d75\" (UID: \"e4bd29d6-3fa8-44dd-993a-47feb6717d75\") " Mar 21 05:38:05 crc kubenswrapper[4580]: I0321 05:38:05.311055 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4bd29d6-3fa8-44dd-993a-47feb6717d75-kube-api-access-dn4x9" (OuterVolumeSpecName: "kube-api-access-dn4x9") pod "e4bd29d6-3fa8-44dd-993a-47feb6717d75" (UID: "e4bd29d6-3fa8-44dd-993a-47feb6717d75"). InnerVolumeSpecName "kube-api-access-dn4x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:38:05 crc kubenswrapper[4580]: I0321 05:38:05.408954 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn4x9\" (UniqueName: \"kubernetes.io/projected/e4bd29d6-3fa8-44dd-993a-47feb6717d75-kube-api-access-dn4x9\") on node \"crc\" DevicePath \"\"" Mar 21 05:38:05 crc kubenswrapper[4580]: I0321 05:38:05.801770 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567858-g2wsm" event={"ID":"e4bd29d6-3fa8-44dd-993a-47feb6717d75","Type":"ContainerDied","Data":"b7a7705788a0f306e4bae42da7b3798bb817b132a3318231c78bffc98293dc52"} Mar 21 05:38:05 crc kubenswrapper[4580]: I0321 05:38:05.801835 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7a7705788a0f306e4bae42da7b3798bb817b132a3318231c78bffc98293dc52" Mar 21 05:38:05 crc kubenswrapper[4580]: I0321 05:38:05.801876 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-g2wsm" Mar 21 05:38:05 crc kubenswrapper[4580]: I0321 05:38:05.867099 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gc7zd"] Mar 21 05:38:05 crc kubenswrapper[4580]: I0321 05:38:05.874607 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gc7zd"] Mar 21 05:38:07 crc kubenswrapper[4580]: I0321 05:38:07.618428 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:38:07 crc kubenswrapper[4580]: E0321 05:38:07.618990 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:38:07 crc kubenswrapper[4580]: I0321 05:38:07.630110 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa25cff0-b73c-49be-aabf-0215e64a0cbb" path="/var/lib/kubelet/pods/fa25cff0-b73c-49be-aabf-0215e64a0cbb/volumes" Mar 21 05:38:14 crc kubenswrapper[4580]: I0321 05:38:14.562881 4580 scope.go:117] "RemoveContainer" containerID="258d92896bb4f8e759af6cd8e8ba30c66812606ca78ec41821efcde13e262dc2" Mar 21 05:38:19 crc kubenswrapper[4580]: I0321 05:38:19.617472 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:38:19 crc kubenswrapper[4580]: E0321 05:38:19.618184 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:38:30 crc kubenswrapper[4580]: I0321 05:38:30.618033 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:38:30 crc kubenswrapper[4580]: E0321 05:38:30.619013 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:38:44 crc kubenswrapper[4580]: I0321 05:38:44.618583 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:38:44 crc kubenswrapper[4580]: E0321 05:38:44.619391 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:38:56 crc kubenswrapper[4580]: I0321 05:38:56.617886 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:38:57 crc kubenswrapper[4580]: I0321 05:38:57.272929 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" 
event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"7da25fc56e97732a5d1594544cf7d6bd189dbc61b5ca3f85d33f7ac4406a856c"} Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.609427 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ppdw6"] Mar 21 05:39:03 crc kubenswrapper[4580]: E0321 05:39:03.610472 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bd29d6-3fa8-44dd-993a-47feb6717d75" containerName="oc" Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.610491 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bd29d6-3fa8-44dd-993a-47feb6717d75" containerName="oc" Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.610733 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4bd29d6-3fa8-44dd-993a-47feb6717d75" containerName="oc" Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.612677 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.630812 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ppdw6"] Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.700799 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-utilities\") pod \"community-operators-ppdw6\" (UID: \"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\") " pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.700867 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-catalog-content\") pod \"community-operators-ppdw6\" (UID: 
\"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\") " pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.701023 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-849xt\" (UniqueName: \"kubernetes.io/projected/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-kube-api-access-849xt\") pod \"community-operators-ppdw6\" (UID: \"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\") " pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.802667 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-utilities\") pod \"community-operators-ppdw6\" (UID: \"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\") " pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.802734 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-catalog-content\") pod \"community-operators-ppdw6\" (UID: \"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\") " pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.802772 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-849xt\" (UniqueName: \"kubernetes.io/projected/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-kube-api-access-849xt\") pod \"community-operators-ppdw6\" (UID: \"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\") " pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.803268 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-utilities\") pod \"community-operators-ppdw6\" (UID: 
\"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\") " pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.803383 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-catalog-content\") pod \"community-operators-ppdw6\" (UID: \"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\") " pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.826318 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-849xt\" (UniqueName: \"kubernetes.io/projected/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-kube-api-access-849xt\") pod \"community-operators-ppdw6\" (UID: \"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\") " pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:03 crc kubenswrapper[4580]: I0321 05:39:03.934135 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:04 crc kubenswrapper[4580]: I0321 05:39:04.520348 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ppdw6"] Mar 21 05:39:05 crc kubenswrapper[4580]: I0321 05:39:05.347997 4580 generic.go:334] "Generic (PLEG): container finished" podID="54e6ab65-a6d7-401a-8a7d-4dda4b221e93" containerID="90ff3c82e1d93e4a191e74388520053567d61aaf2857fc433a4c0eac99a441fa" exitCode=0 Mar 21 05:39:05 crc kubenswrapper[4580]: I0321 05:39:05.348060 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppdw6" event={"ID":"54e6ab65-a6d7-401a-8a7d-4dda4b221e93","Type":"ContainerDied","Data":"90ff3c82e1d93e4a191e74388520053567d61aaf2857fc433a4c0eac99a441fa"} Mar 21 05:39:05 crc kubenswrapper[4580]: I0321 05:39:05.348313 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppdw6" event={"ID":"54e6ab65-a6d7-401a-8a7d-4dda4b221e93","Type":"ContainerStarted","Data":"ebb0720f224b931101f5559f312e01e382bb84b99c5abd8988264598c4083bfd"} Mar 21 05:39:06 crc kubenswrapper[4580]: I0321 05:39:06.359154 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppdw6" event={"ID":"54e6ab65-a6d7-401a-8a7d-4dda4b221e93","Type":"ContainerStarted","Data":"28ab1206fff69ecca6d1dda6c91898cc3d27461b2fe1652bfecd88bc62d8433d"} Mar 21 05:39:08 crc kubenswrapper[4580]: I0321 05:39:08.379172 4580 generic.go:334] "Generic (PLEG): container finished" podID="54e6ab65-a6d7-401a-8a7d-4dda4b221e93" containerID="28ab1206fff69ecca6d1dda6c91898cc3d27461b2fe1652bfecd88bc62d8433d" exitCode=0 Mar 21 05:39:08 crc kubenswrapper[4580]: I0321 05:39:08.379254 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppdw6" 
event={"ID":"54e6ab65-a6d7-401a-8a7d-4dda4b221e93","Type":"ContainerDied","Data":"28ab1206fff69ecca6d1dda6c91898cc3d27461b2fe1652bfecd88bc62d8433d"} Mar 21 05:39:09 crc kubenswrapper[4580]: I0321 05:39:09.397081 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppdw6" event={"ID":"54e6ab65-a6d7-401a-8a7d-4dda4b221e93","Type":"ContainerStarted","Data":"4f37c7459784316eba20534af4eb4013b6ee4ac41b476c44b9bf032ffaf0754f"} Mar 21 05:39:09 crc kubenswrapper[4580]: I0321 05:39:09.419436 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ppdw6" podStartSLOduration=3.007752542 podStartE2EDuration="6.419416299s" podCreationTimestamp="2026-03-21 05:39:03 +0000 UTC" firstStartedPulling="2026-03-21 05:39:05.350272099 +0000 UTC m=+2850.432855727" lastFinishedPulling="2026-03-21 05:39:08.761935856 +0000 UTC m=+2853.844519484" observedRunningTime="2026-03-21 05:39:09.41833089 +0000 UTC m=+2854.500914538" watchObservedRunningTime="2026-03-21 05:39:09.419416299 +0000 UTC m=+2854.501999927" Mar 21 05:39:13 crc kubenswrapper[4580]: I0321 05:39:13.934482 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:13 crc kubenswrapper[4580]: I0321 05:39:13.935050 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:14 crc kubenswrapper[4580]: I0321 05:39:14.981494 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ppdw6" podUID="54e6ab65-a6d7-401a-8a7d-4dda4b221e93" containerName="registry-server" probeResult="failure" output=< Mar 21 05:39:14 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:39:14 crc kubenswrapper[4580]: > Mar 21 05:39:18 crc kubenswrapper[4580]: I0321 05:39:18.315741 4580 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/certified-operators-mhf4n"] Mar 21 05:39:18 crc kubenswrapper[4580]: I0321 05:39:18.318776 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:18 crc kubenswrapper[4580]: I0321 05:39:18.349968 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhf4n"] Mar 21 05:39:18 crc kubenswrapper[4580]: I0321 05:39:18.391989 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2931bd73-571a-44f4-9cd8-54ad9129c147-utilities\") pod \"certified-operators-mhf4n\" (UID: \"2931bd73-571a-44f4-9cd8-54ad9129c147\") " pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:18 crc kubenswrapper[4580]: I0321 05:39:18.392085 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57z4r\" (UniqueName: \"kubernetes.io/projected/2931bd73-571a-44f4-9cd8-54ad9129c147-kube-api-access-57z4r\") pod \"certified-operators-mhf4n\" (UID: \"2931bd73-571a-44f4-9cd8-54ad9129c147\") " pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:18 crc kubenswrapper[4580]: I0321 05:39:18.392152 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2931bd73-571a-44f4-9cd8-54ad9129c147-catalog-content\") pod \"certified-operators-mhf4n\" (UID: \"2931bd73-571a-44f4-9cd8-54ad9129c147\") " pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:18 crc kubenswrapper[4580]: I0321 05:39:18.494416 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2931bd73-571a-44f4-9cd8-54ad9129c147-utilities\") pod \"certified-operators-mhf4n\" (UID: 
\"2931bd73-571a-44f4-9cd8-54ad9129c147\") " pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:18 crc kubenswrapper[4580]: I0321 05:39:18.494521 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57z4r\" (UniqueName: \"kubernetes.io/projected/2931bd73-571a-44f4-9cd8-54ad9129c147-kube-api-access-57z4r\") pod \"certified-operators-mhf4n\" (UID: \"2931bd73-571a-44f4-9cd8-54ad9129c147\") " pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:18 crc kubenswrapper[4580]: I0321 05:39:18.494579 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2931bd73-571a-44f4-9cd8-54ad9129c147-catalog-content\") pod \"certified-operators-mhf4n\" (UID: \"2931bd73-571a-44f4-9cd8-54ad9129c147\") " pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:18 crc kubenswrapper[4580]: I0321 05:39:18.495458 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2931bd73-571a-44f4-9cd8-54ad9129c147-catalog-content\") pod \"certified-operators-mhf4n\" (UID: \"2931bd73-571a-44f4-9cd8-54ad9129c147\") " pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:18 crc kubenswrapper[4580]: I0321 05:39:18.495818 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2931bd73-571a-44f4-9cd8-54ad9129c147-utilities\") pod \"certified-operators-mhf4n\" (UID: \"2931bd73-571a-44f4-9cd8-54ad9129c147\") " pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:18 crc kubenswrapper[4580]: I0321 05:39:18.526990 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57z4r\" (UniqueName: \"kubernetes.io/projected/2931bd73-571a-44f4-9cd8-54ad9129c147-kube-api-access-57z4r\") pod \"certified-operators-mhf4n\" (UID: 
\"2931bd73-571a-44f4-9cd8-54ad9129c147\") " pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:18 crc kubenswrapper[4580]: I0321 05:39:18.647007 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:19 crc kubenswrapper[4580]: I0321 05:39:19.275457 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhf4n"] Mar 21 05:39:19 crc kubenswrapper[4580]: I0321 05:39:19.489523 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhf4n" event={"ID":"2931bd73-571a-44f4-9cd8-54ad9129c147","Type":"ContainerStarted","Data":"b98191809ed73b46bbc46c341b87bcc46cb9f6456c09bf993fb839aec7e20979"} Mar 21 05:39:20 crc kubenswrapper[4580]: I0321 05:39:20.500865 4580 generic.go:334] "Generic (PLEG): container finished" podID="2931bd73-571a-44f4-9cd8-54ad9129c147" containerID="0ab7dbb50d40e744ea2fc3fb4bb730415ca8fa395458f6936b08d859803cb6cf" exitCode=0 Mar 21 05:39:20 crc kubenswrapper[4580]: I0321 05:39:20.500969 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhf4n" event={"ID":"2931bd73-571a-44f4-9cd8-54ad9129c147","Type":"ContainerDied","Data":"0ab7dbb50d40e744ea2fc3fb4bb730415ca8fa395458f6936b08d859803cb6cf"} Mar 21 05:39:21 crc kubenswrapper[4580]: I0321 05:39:21.515002 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhf4n" event={"ID":"2931bd73-571a-44f4-9cd8-54ad9129c147","Type":"ContainerStarted","Data":"f5ad72aa51de85ecc1222e14aa5acf9678e3aa285c4a6dfe1ae1954b7caaa399"} Mar 21 05:39:23 crc kubenswrapper[4580]: I0321 05:39:23.535377 4580 generic.go:334] "Generic (PLEG): container finished" podID="2931bd73-571a-44f4-9cd8-54ad9129c147" containerID="f5ad72aa51de85ecc1222e14aa5acf9678e3aa285c4a6dfe1ae1954b7caaa399" exitCode=0 Mar 21 05:39:23 crc kubenswrapper[4580]: I0321 
05:39:23.535415 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhf4n" event={"ID":"2931bd73-571a-44f4-9cd8-54ad9129c147","Type":"ContainerDied","Data":"f5ad72aa51de85ecc1222e14aa5acf9678e3aa285c4a6dfe1ae1954b7caaa399"} Mar 21 05:39:23 crc kubenswrapper[4580]: I0321 05:39:23.989610 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:24 crc kubenswrapper[4580]: I0321 05:39:24.067475 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:24 crc kubenswrapper[4580]: I0321 05:39:24.548724 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhf4n" event={"ID":"2931bd73-571a-44f4-9cd8-54ad9129c147","Type":"ContainerStarted","Data":"47db7335ed340f8b7135622bdd09ba4e63693a687e7de4887367cc0af8a8d14d"} Mar 21 05:39:24 crc kubenswrapper[4580]: I0321 05:39:24.573733 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mhf4n" podStartSLOduration=3.110196908 podStartE2EDuration="6.573713057s" podCreationTimestamp="2026-03-21 05:39:18 +0000 UTC" firstStartedPulling="2026-03-21 05:39:20.503595121 +0000 UTC m=+2865.586178749" lastFinishedPulling="2026-03-21 05:39:23.96711127 +0000 UTC m=+2869.049694898" observedRunningTime="2026-03-21 05:39:24.567607712 +0000 UTC m=+2869.650191340" watchObservedRunningTime="2026-03-21 05:39:24.573713057 +0000 UTC m=+2869.656296695" Mar 21 05:39:25 crc kubenswrapper[4580]: I0321 05:39:25.506754 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ppdw6"] Mar 21 05:39:25 crc kubenswrapper[4580]: I0321 05:39:25.556590 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ppdw6" 
podUID="54e6ab65-a6d7-401a-8a7d-4dda4b221e93" containerName="registry-server" containerID="cri-o://4f37c7459784316eba20534af4eb4013b6ee4ac41b476c44b9bf032ffaf0754f" gracePeriod=2 Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.128986 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.244572 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-catalog-content\") pod \"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\" (UID: \"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\") " Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.244733 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-utilities\") pod \"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\" (UID: \"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\") " Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.244972 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-849xt\" (UniqueName: \"kubernetes.io/projected/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-kube-api-access-849xt\") pod \"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\" (UID: \"54e6ab65-a6d7-401a-8a7d-4dda4b221e93\") " Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.245516 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-utilities" (OuterVolumeSpecName: "utilities") pod "54e6ab65-a6d7-401a-8a7d-4dda4b221e93" (UID: "54e6ab65-a6d7-401a-8a7d-4dda4b221e93"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.245772 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.250968 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-kube-api-access-849xt" (OuterVolumeSpecName: "kube-api-access-849xt") pod "54e6ab65-a6d7-401a-8a7d-4dda4b221e93" (UID: "54e6ab65-a6d7-401a-8a7d-4dda4b221e93"). InnerVolumeSpecName "kube-api-access-849xt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.297458 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54e6ab65-a6d7-401a-8a7d-4dda4b221e93" (UID: "54e6ab65-a6d7-401a-8a7d-4dda4b221e93"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.347922 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-849xt\" (UniqueName: \"kubernetes.io/projected/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-kube-api-access-849xt\") on node \"crc\" DevicePath \"\"" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.347965 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e6ab65-a6d7-401a-8a7d-4dda4b221e93-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.567075 4580 generic.go:334] "Generic (PLEG): container finished" podID="54e6ab65-a6d7-401a-8a7d-4dda4b221e93" containerID="4f37c7459784316eba20534af4eb4013b6ee4ac41b476c44b9bf032ffaf0754f" exitCode=0 Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.567122 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ppdw6" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.567145 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppdw6" event={"ID":"54e6ab65-a6d7-401a-8a7d-4dda4b221e93","Type":"ContainerDied","Data":"4f37c7459784316eba20534af4eb4013b6ee4ac41b476c44b9bf032ffaf0754f"} Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.567980 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppdw6" event={"ID":"54e6ab65-a6d7-401a-8a7d-4dda4b221e93","Type":"ContainerDied","Data":"ebb0720f224b931101f5559f312e01e382bb84b99c5abd8988264598c4083bfd"} Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.568003 4580 scope.go:117] "RemoveContainer" containerID="4f37c7459784316eba20534af4eb4013b6ee4ac41b476c44b9bf032ffaf0754f" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.598318 4580 scope.go:117] "RemoveContainer" 
containerID="28ab1206fff69ecca6d1dda6c91898cc3d27461b2fe1652bfecd88bc62d8433d" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.606203 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ppdw6"] Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.619039 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ppdw6"] Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.624904 4580 scope.go:117] "RemoveContainer" containerID="90ff3c82e1d93e4a191e74388520053567d61aaf2857fc433a4c0eac99a441fa" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.662603 4580 scope.go:117] "RemoveContainer" containerID="4f37c7459784316eba20534af4eb4013b6ee4ac41b476c44b9bf032ffaf0754f" Mar 21 05:39:26 crc kubenswrapper[4580]: E0321 05:39:26.663015 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f37c7459784316eba20534af4eb4013b6ee4ac41b476c44b9bf032ffaf0754f\": container with ID starting with 4f37c7459784316eba20534af4eb4013b6ee4ac41b476c44b9bf032ffaf0754f not found: ID does not exist" containerID="4f37c7459784316eba20534af4eb4013b6ee4ac41b476c44b9bf032ffaf0754f" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.663051 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f37c7459784316eba20534af4eb4013b6ee4ac41b476c44b9bf032ffaf0754f"} err="failed to get container status \"4f37c7459784316eba20534af4eb4013b6ee4ac41b476c44b9bf032ffaf0754f\": rpc error: code = NotFound desc = could not find container \"4f37c7459784316eba20534af4eb4013b6ee4ac41b476c44b9bf032ffaf0754f\": container with ID starting with 4f37c7459784316eba20534af4eb4013b6ee4ac41b476c44b9bf032ffaf0754f not found: ID does not exist" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.663134 4580 scope.go:117] "RemoveContainer" 
containerID="28ab1206fff69ecca6d1dda6c91898cc3d27461b2fe1652bfecd88bc62d8433d" Mar 21 05:39:26 crc kubenswrapper[4580]: E0321 05:39:26.663372 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28ab1206fff69ecca6d1dda6c91898cc3d27461b2fe1652bfecd88bc62d8433d\": container with ID starting with 28ab1206fff69ecca6d1dda6c91898cc3d27461b2fe1652bfecd88bc62d8433d not found: ID does not exist" containerID="28ab1206fff69ecca6d1dda6c91898cc3d27461b2fe1652bfecd88bc62d8433d" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.663406 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28ab1206fff69ecca6d1dda6c91898cc3d27461b2fe1652bfecd88bc62d8433d"} err="failed to get container status \"28ab1206fff69ecca6d1dda6c91898cc3d27461b2fe1652bfecd88bc62d8433d\": rpc error: code = NotFound desc = could not find container \"28ab1206fff69ecca6d1dda6c91898cc3d27461b2fe1652bfecd88bc62d8433d\": container with ID starting with 28ab1206fff69ecca6d1dda6c91898cc3d27461b2fe1652bfecd88bc62d8433d not found: ID does not exist" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.663424 4580 scope.go:117] "RemoveContainer" containerID="90ff3c82e1d93e4a191e74388520053567d61aaf2857fc433a4c0eac99a441fa" Mar 21 05:39:26 crc kubenswrapper[4580]: E0321 05:39:26.663872 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90ff3c82e1d93e4a191e74388520053567d61aaf2857fc433a4c0eac99a441fa\": container with ID starting with 90ff3c82e1d93e4a191e74388520053567d61aaf2857fc433a4c0eac99a441fa not found: ID does not exist" containerID="90ff3c82e1d93e4a191e74388520053567d61aaf2857fc433a4c0eac99a441fa" Mar 21 05:39:26 crc kubenswrapper[4580]: I0321 05:39:26.663900 4580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"90ff3c82e1d93e4a191e74388520053567d61aaf2857fc433a4c0eac99a441fa"} err="failed to get container status \"90ff3c82e1d93e4a191e74388520053567d61aaf2857fc433a4c0eac99a441fa\": rpc error: code = NotFound desc = could not find container \"90ff3c82e1d93e4a191e74388520053567d61aaf2857fc433a4c0eac99a441fa\": container with ID starting with 90ff3c82e1d93e4a191e74388520053567d61aaf2857fc433a4c0eac99a441fa not found: ID does not exist" Mar 21 05:39:27 crc kubenswrapper[4580]: I0321 05:39:27.632063 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54e6ab65-a6d7-401a-8a7d-4dda4b221e93" path="/var/lib/kubelet/pods/54e6ab65-a6d7-401a-8a7d-4dda4b221e93/volumes" Mar 21 05:39:28 crc kubenswrapper[4580]: I0321 05:39:28.647156 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:28 crc kubenswrapper[4580]: I0321 05:39:28.647221 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:28 crc kubenswrapper[4580]: I0321 05:39:28.697324 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:29 crc kubenswrapper[4580]: I0321 05:39:29.657361 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:30 crc kubenswrapper[4580]: I0321 05:39:30.308084 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhf4n"] Mar 21 05:39:31 crc kubenswrapper[4580]: I0321 05:39:31.626252 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mhf4n" podUID="2931bd73-571a-44f4-9cd8-54ad9129c147" containerName="registry-server" 
containerID="cri-o://47db7335ed340f8b7135622bdd09ba4e63693a687e7de4887367cc0af8a8d14d" gracePeriod=2 Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.216445 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.386317 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2931bd73-571a-44f4-9cd8-54ad9129c147-catalog-content\") pod \"2931bd73-571a-44f4-9cd8-54ad9129c147\" (UID: \"2931bd73-571a-44f4-9cd8-54ad9129c147\") " Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.386581 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57z4r\" (UniqueName: \"kubernetes.io/projected/2931bd73-571a-44f4-9cd8-54ad9129c147-kube-api-access-57z4r\") pod \"2931bd73-571a-44f4-9cd8-54ad9129c147\" (UID: \"2931bd73-571a-44f4-9cd8-54ad9129c147\") " Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.386805 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2931bd73-571a-44f4-9cd8-54ad9129c147-utilities\") pod \"2931bd73-571a-44f4-9cd8-54ad9129c147\" (UID: \"2931bd73-571a-44f4-9cd8-54ad9129c147\") " Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.387807 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2931bd73-571a-44f4-9cd8-54ad9129c147-utilities" (OuterVolumeSpecName: "utilities") pod "2931bd73-571a-44f4-9cd8-54ad9129c147" (UID: "2931bd73-571a-44f4-9cd8-54ad9129c147"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.392535 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2931bd73-571a-44f4-9cd8-54ad9129c147-kube-api-access-57z4r" (OuterVolumeSpecName: "kube-api-access-57z4r") pod "2931bd73-571a-44f4-9cd8-54ad9129c147" (UID: "2931bd73-571a-44f4-9cd8-54ad9129c147"). InnerVolumeSpecName "kube-api-access-57z4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.489428 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57z4r\" (UniqueName: \"kubernetes.io/projected/2931bd73-571a-44f4-9cd8-54ad9129c147-kube-api-access-57z4r\") on node \"crc\" DevicePath \"\"" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.489474 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2931bd73-571a-44f4-9cd8-54ad9129c147-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.637567 4580 generic.go:334] "Generic (PLEG): container finished" podID="2931bd73-571a-44f4-9cd8-54ad9129c147" containerID="47db7335ed340f8b7135622bdd09ba4e63693a687e7de4887367cc0af8a8d14d" exitCode=0 Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.637620 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhf4n" event={"ID":"2931bd73-571a-44f4-9cd8-54ad9129c147","Type":"ContainerDied","Data":"47db7335ed340f8b7135622bdd09ba4e63693a687e7de4887367cc0af8a8d14d"} Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.637663 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhf4n" event={"ID":"2931bd73-571a-44f4-9cd8-54ad9129c147","Type":"ContainerDied","Data":"b98191809ed73b46bbc46c341b87bcc46cb9f6456c09bf993fb839aec7e20979"} Mar 21 05:39:32 crc kubenswrapper[4580]: 
I0321 05:39:32.637680 4580 scope.go:117] "RemoveContainer" containerID="47db7335ed340f8b7135622bdd09ba4e63693a687e7de4887367cc0af8a8d14d" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.637638 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhf4n" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.658156 4580 scope.go:117] "RemoveContainer" containerID="f5ad72aa51de85ecc1222e14aa5acf9678e3aa285c4a6dfe1ae1954b7caaa399" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.676604 4580 scope.go:117] "RemoveContainer" containerID="0ab7dbb50d40e744ea2fc3fb4bb730415ca8fa395458f6936b08d859803cb6cf" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.728592 4580 scope.go:117] "RemoveContainer" containerID="47db7335ed340f8b7135622bdd09ba4e63693a687e7de4887367cc0af8a8d14d" Mar 21 05:39:32 crc kubenswrapper[4580]: E0321 05:39:32.729046 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47db7335ed340f8b7135622bdd09ba4e63693a687e7de4887367cc0af8a8d14d\": container with ID starting with 47db7335ed340f8b7135622bdd09ba4e63693a687e7de4887367cc0af8a8d14d not found: ID does not exist" containerID="47db7335ed340f8b7135622bdd09ba4e63693a687e7de4887367cc0af8a8d14d" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.729076 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47db7335ed340f8b7135622bdd09ba4e63693a687e7de4887367cc0af8a8d14d"} err="failed to get container status \"47db7335ed340f8b7135622bdd09ba4e63693a687e7de4887367cc0af8a8d14d\": rpc error: code = NotFound desc = could not find container \"47db7335ed340f8b7135622bdd09ba4e63693a687e7de4887367cc0af8a8d14d\": container with ID starting with 47db7335ed340f8b7135622bdd09ba4e63693a687e7de4887367cc0af8a8d14d not found: ID does not exist" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.729099 4580 
scope.go:117] "RemoveContainer" containerID="f5ad72aa51de85ecc1222e14aa5acf9678e3aa285c4a6dfe1ae1954b7caaa399" Mar 21 05:39:32 crc kubenswrapper[4580]: E0321 05:39:32.729415 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ad72aa51de85ecc1222e14aa5acf9678e3aa285c4a6dfe1ae1954b7caaa399\": container with ID starting with f5ad72aa51de85ecc1222e14aa5acf9678e3aa285c4a6dfe1ae1954b7caaa399 not found: ID does not exist" containerID="f5ad72aa51de85ecc1222e14aa5acf9678e3aa285c4a6dfe1ae1954b7caaa399" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.729434 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ad72aa51de85ecc1222e14aa5acf9678e3aa285c4a6dfe1ae1954b7caaa399"} err="failed to get container status \"f5ad72aa51de85ecc1222e14aa5acf9678e3aa285c4a6dfe1ae1954b7caaa399\": rpc error: code = NotFound desc = could not find container \"f5ad72aa51de85ecc1222e14aa5acf9678e3aa285c4a6dfe1ae1954b7caaa399\": container with ID starting with f5ad72aa51de85ecc1222e14aa5acf9678e3aa285c4a6dfe1ae1954b7caaa399 not found: ID does not exist" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.729451 4580 scope.go:117] "RemoveContainer" containerID="0ab7dbb50d40e744ea2fc3fb4bb730415ca8fa395458f6936b08d859803cb6cf" Mar 21 05:39:32 crc kubenswrapper[4580]: E0321 05:39:32.729931 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab7dbb50d40e744ea2fc3fb4bb730415ca8fa395458f6936b08d859803cb6cf\": container with ID starting with 0ab7dbb50d40e744ea2fc3fb4bb730415ca8fa395458f6936b08d859803cb6cf not found: ID does not exist" containerID="0ab7dbb50d40e744ea2fc3fb4bb730415ca8fa395458f6936b08d859803cb6cf" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.729983 4580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0ab7dbb50d40e744ea2fc3fb4bb730415ca8fa395458f6936b08d859803cb6cf"} err="failed to get container status \"0ab7dbb50d40e744ea2fc3fb4bb730415ca8fa395458f6936b08d859803cb6cf\": rpc error: code = NotFound desc = could not find container \"0ab7dbb50d40e744ea2fc3fb4bb730415ca8fa395458f6936b08d859803cb6cf\": container with ID starting with 0ab7dbb50d40e744ea2fc3fb4bb730415ca8fa395458f6936b08d859803cb6cf not found: ID does not exist" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.833939 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2931bd73-571a-44f4-9cd8-54ad9129c147-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2931bd73-571a-44f4-9cd8-54ad9129c147" (UID: "2931bd73-571a-44f4-9cd8-54ad9129c147"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.897604 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2931bd73-571a-44f4-9cd8-54ad9129c147-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.979941 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhf4n"] Mar 21 05:39:32 crc kubenswrapper[4580]: I0321 05:39:32.987907 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mhf4n"] Mar 21 05:39:33 crc kubenswrapper[4580]: I0321 05:39:33.627558 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2931bd73-571a-44f4-9cd8-54ad9129c147" path="/var/lib/kubelet/pods/2931bd73-571a-44f4-9cd8-54ad9129c147/volumes" Mar 21 05:39:51 crc kubenswrapper[4580]: I0321 05:39:51.943473 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7rtth"] Mar 21 05:39:51 crc kubenswrapper[4580]: E0321 
05:39:51.944383 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2931bd73-571a-44f4-9cd8-54ad9129c147" containerName="registry-server" Mar 21 05:39:51 crc kubenswrapper[4580]: I0321 05:39:51.944397 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2931bd73-571a-44f4-9cd8-54ad9129c147" containerName="registry-server" Mar 21 05:39:51 crc kubenswrapper[4580]: E0321 05:39:51.944412 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2931bd73-571a-44f4-9cd8-54ad9129c147" containerName="extract-utilities" Mar 21 05:39:51 crc kubenswrapper[4580]: I0321 05:39:51.944418 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2931bd73-571a-44f4-9cd8-54ad9129c147" containerName="extract-utilities" Mar 21 05:39:51 crc kubenswrapper[4580]: E0321 05:39:51.944430 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e6ab65-a6d7-401a-8a7d-4dda4b221e93" containerName="extract-content" Mar 21 05:39:51 crc kubenswrapper[4580]: I0321 05:39:51.944436 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e6ab65-a6d7-401a-8a7d-4dda4b221e93" containerName="extract-content" Mar 21 05:39:51 crc kubenswrapper[4580]: E0321 05:39:51.944447 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e6ab65-a6d7-401a-8a7d-4dda4b221e93" containerName="extract-utilities" Mar 21 05:39:51 crc kubenswrapper[4580]: I0321 05:39:51.944454 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e6ab65-a6d7-401a-8a7d-4dda4b221e93" containerName="extract-utilities" Mar 21 05:39:51 crc kubenswrapper[4580]: E0321 05:39:51.944465 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2931bd73-571a-44f4-9cd8-54ad9129c147" containerName="extract-content" Mar 21 05:39:51 crc kubenswrapper[4580]: I0321 05:39:51.944470 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2931bd73-571a-44f4-9cd8-54ad9129c147" containerName="extract-content" Mar 21 05:39:51 crc kubenswrapper[4580]: E0321 
05:39:51.944501 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e6ab65-a6d7-401a-8a7d-4dda4b221e93" containerName="registry-server" Mar 21 05:39:51 crc kubenswrapper[4580]: I0321 05:39:51.944507 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e6ab65-a6d7-401a-8a7d-4dda4b221e93" containerName="registry-server" Mar 21 05:39:51 crc kubenswrapper[4580]: I0321 05:39:51.944665 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e6ab65-a6d7-401a-8a7d-4dda4b221e93" containerName="registry-server" Mar 21 05:39:51 crc kubenswrapper[4580]: I0321 05:39:51.944675 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2931bd73-571a-44f4-9cd8-54ad9129c147" containerName="registry-server" Mar 21 05:39:51 crc kubenswrapper[4580]: I0321 05:39:51.945983 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:39:51 crc kubenswrapper[4580]: I0321 05:39:51.961005 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rtth"] Mar 21 05:39:52 crc kubenswrapper[4580]: I0321 05:39:52.059193 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k62hc\" (UniqueName: \"kubernetes.io/projected/3114ab35-00fa-474c-a746-a7333f2ccb3a-kube-api-access-k62hc\") pod \"redhat-marketplace-7rtth\" (UID: \"3114ab35-00fa-474c-a746-a7333f2ccb3a\") " pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:39:52 crc kubenswrapper[4580]: I0321 05:39:52.059360 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3114ab35-00fa-474c-a746-a7333f2ccb3a-catalog-content\") pod \"redhat-marketplace-7rtth\" (UID: \"3114ab35-00fa-474c-a746-a7333f2ccb3a\") " pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:39:52 crc 
kubenswrapper[4580]: I0321 05:39:52.059402 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3114ab35-00fa-474c-a746-a7333f2ccb3a-utilities\") pod \"redhat-marketplace-7rtth\" (UID: \"3114ab35-00fa-474c-a746-a7333f2ccb3a\") " pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:39:52 crc kubenswrapper[4580]: I0321 05:39:52.160945 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3114ab35-00fa-474c-a746-a7333f2ccb3a-catalog-content\") pod \"redhat-marketplace-7rtth\" (UID: \"3114ab35-00fa-474c-a746-a7333f2ccb3a\") " pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:39:52 crc kubenswrapper[4580]: I0321 05:39:52.161012 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3114ab35-00fa-474c-a746-a7333f2ccb3a-utilities\") pod \"redhat-marketplace-7rtth\" (UID: \"3114ab35-00fa-474c-a746-a7333f2ccb3a\") " pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:39:52 crc kubenswrapper[4580]: I0321 05:39:52.161063 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k62hc\" (UniqueName: \"kubernetes.io/projected/3114ab35-00fa-474c-a746-a7333f2ccb3a-kube-api-access-k62hc\") pod \"redhat-marketplace-7rtth\" (UID: \"3114ab35-00fa-474c-a746-a7333f2ccb3a\") " pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:39:52 crc kubenswrapper[4580]: I0321 05:39:52.161701 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3114ab35-00fa-474c-a746-a7333f2ccb3a-catalog-content\") pod \"redhat-marketplace-7rtth\" (UID: \"3114ab35-00fa-474c-a746-a7333f2ccb3a\") " pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:39:52 crc kubenswrapper[4580]: 
I0321 05:39:52.161727 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3114ab35-00fa-474c-a746-a7333f2ccb3a-utilities\") pod \"redhat-marketplace-7rtth\" (UID: \"3114ab35-00fa-474c-a746-a7333f2ccb3a\") " pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:39:52 crc kubenswrapper[4580]: I0321 05:39:52.193013 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k62hc\" (UniqueName: \"kubernetes.io/projected/3114ab35-00fa-474c-a746-a7333f2ccb3a-kube-api-access-k62hc\") pod \"redhat-marketplace-7rtth\" (UID: \"3114ab35-00fa-474c-a746-a7333f2ccb3a\") " pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:39:52 crc kubenswrapper[4580]: I0321 05:39:52.281544 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:39:52 crc kubenswrapper[4580]: I0321 05:39:52.735526 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rtth"] Mar 21 05:39:52 crc kubenswrapper[4580]: I0321 05:39:52.810191 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rtth" event={"ID":"3114ab35-00fa-474c-a746-a7333f2ccb3a","Type":"ContainerStarted","Data":"6007d56263bd5d94a55350cf8596285e4f823deae53847b6051dbace8c3760a6"} Mar 21 05:39:53 crc kubenswrapper[4580]: I0321 05:39:53.827417 4580 generic.go:334] "Generic (PLEG): container finished" podID="3114ab35-00fa-474c-a746-a7333f2ccb3a" containerID="af30a83be315a42aa943bb3243695e7b06a855ab25e1c2c63bd82eb010dcff16" exitCode=0 Mar 21 05:39:53 crc kubenswrapper[4580]: I0321 05:39:53.827630 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rtth" event={"ID":"3114ab35-00fa-474c-a746-a7333f2ccb3a","Type":"ContainerDied","Data":"af30a83be315a42aa943bb3243695e7b06a855ab25e1c2c63bd82eb010dcff16"} Mar 
21 05:39:53 crc kubenswrapper[4580]: I0321 05:39:53.832993 4580 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:39:54 crc kubenswrapper[4580]: I0321 05:39:54.837973 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rtth" event={"ID":"3114ab35-00fa-474c-a746-a7333f2ccb3a","Type":"ContainerStarted","Data":"78edb7aea5a231fd4c4f0e4b432ff411b56c51dea1ef19dbd68d06622b2027d0"} Mar 21 05:39:55 crc kubenswrapper[4580]: I0321 05:39:55.849493 4580 generic.go:334] "Generic (PLEG): container finished" podID="3114ab35-00fa-474c-a746-a7333f2ccb3a" containerID="78edb7aea5a231fd4c4f0e4b432ff411b56c51dea1ef19dbd68d06622b2027d0" exitCode=0 Mar 21 05:39:55 crc kubenswrapper[4580]: I0321 05:39:55.849792 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rtth" event={"ID":"3114ab35-00fa-474c-a746-a7333f2ccb3a","Type":"ContainerDied","Data":"78edb7aea5a231fd4c4f0e4b432ff411b56c51dea1ef19dbd68d06622b2027d0"} Mar 21 05:39:56 crc kubenswrapper[4580]: I0321 05:39:56.863997 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rtth" event={"ID":"3114ab35-00fa-474c-a746-a7333f2ccb3a","Type":"ContainerStarted","Data":"3f4c07d08a86c857dbe6d993921f6278b12a8b0879c92736fa44ae0517bd3f1e"} Mar 21 05:39:56 crc kubenswrapper[4580]: I0321 05:39:56.891078 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7rtth" podStartSLOduration=3.241896818 podStartE2EDuration="5.891040056s" podCreationTimestamp="2026-03-21 05:39:51 +0000 UTC" firstStartedPulling="2026-03-21 05:39:53.832689613 +0000 UTC m=+2898.915273241" lastFinishedPulling="2026-03-21 05:39:56.481832851 +0000 UTC m=+2901.564416479" observedRunningTime="2026-03-21 05:39:56.881444677 +0000 UTC m=+2901.964028305" watchObservedRunningTime="2026-03-21 05:39:56.891040056 +0000 UTC 
m=+2901.973623684" Mar 21 05:40:00 crc kubenswrapper[4580]: I0321 05:40:00.142551 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567860-k7fsb"] Mar 21 05:40:00 crc kubenswrapper[4580]: I0321 05:40:00.144323 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-k7fsb" Mar 21 05:40:00 crc kubenswrapper[4580]: I0321 05:40:00.152752 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:40:00 crc kubenswrapper[4580]: I0321 05:40:00.152929 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:40:00 crc kubenswrapper[4580]: I0321 05:40:00.153425 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:40:00 crc kubenswrapper[4580]: I0321 05:40:00.154490 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567860-k7fsb"] Mar 21 05:40:00 crc kubenswrapper[4580]: I0321 05:40:00.224599 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw9fb\" (UniqueName: \"kubernetes.io/projected/c276b9ba-801d-47b2-b1c9-2e31f66fb4e7-kube-api-access-cw9fb\") pod \"auto-csr-approver-29567860-k7fsb\" (UID: \"c276b9ba-801d-47b2-b1c9-2e31f66fb4e7\") " pod="openshift-infra/auto-csr-approver-29567860-k7fsb" Mar 21 05:40:00 crc kubenswrapper[4580]: I0321 05:40:00.326907 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw9fb\" (UniqueName: \"kubernetes.io/projected/c276b9ba-801d-47b2-b1c9-2e31f66fb4e7-kube-api-access-cw9fb\") pod \"auto-csr-approver-29567860-k7fsb\" (UID: \"c276b9ba-801d-47b2-b1c9-2e31f66fb4e7\") " pod="openshift-infra/auto-csr-approver-29567860-k7fsb" Mar 21 05:40:00 crc kubenswrapper[4580]: I0321 
05:40:00.347354 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw9fb\" (UniqueName: \"kubernetes.io/projected/c276b9ba-801d-47b2-b1c9-2e31f66fb4e7-kube-api-access-cw9fb\") pod \"auto-csr-approver-29567860-k7fsb\" (UID: \"c276b9ba-801d-47b2-b1c9-2e31f66fb4e7\") " pod="openshift-infra/auto-csr-approver-29567860-k7fsb" Mar 21 05:40:00 crc kubenswrapper[4580]: I0321 05:40:00.463645 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-k7fsb" Mar 21 05:40:00 crc kubenswrapper[4580]: I0321 05:40:00.932249 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567860-k7fsb"] Mar 21 05:40:01 crc kubenswrapper[4580]: I0321 05:40:01.907585 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567860-k7fsb" event={"ID":"c276b9ba-801d-47b2-b1c9-2e31f66fb4e7","Type":"ContainerStarted","Data":"f3e3f7ad53742108983143b2e1a82b8ba8cf3ac0ad34226db255e57b69a12720"} Mar 21 05:40:02 crc kubenswrapper[4580]: I0321 05:40:02.282071 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:40:02 crc kubenswrapper[4580]: I0321 05:40:02.282587 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:40:02 crc kubenswrapper[4580]: I0321 05:40:02.330549 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:40:02 crc kubenswrapper[4580]: I0321 05:40:02.917487 4580 generic.go:334] "Generic (PLEG): container finished" podID="c276b9ba-801d-47b2-b1c9-2e31f66fb4e7" containerID="c1e460ca0be6a9730b568009b7fdf335e61c92ec23e98af46ec287e3a242163f" exitCode=0 Mar 21 05:40:02 crc kubenswrapper[4580]: I0321 05:40:02.917533 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567860-k7fsb" event={"ID":"c276b9ba-801d-47b2-b1c9-2e31f66fb4e7","Type":"ContainerDied","Data":"c1e460ca0be6a9730b568009b7fdf335e61c92ec23e98af46ec287e3a242163f"} Mar 21 05:40:02 crc kubenswrapper[4580]: I0321 05:40:02.969293 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:40:03 crc kubenswrapper[4580]: I0321 05:40:03.012984 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rtth"] Mar 21 05:40:04 crc kubenswrapper[4580]: I0321 05:40:04.264100 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-k7fsb" Mar 21 05:40:04 crc kubenswrapper[4580]: I0321 05:40:04.409511 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw9fb\" (UniqueName: \"kubernetes.io/projected/c276b9ba-801d-47b2-b1c9-2e31f66fb4e7-kube-api-access-cw9fb\") pod \"c276b9ba-801d-47b2-b1c9-2e31f66fb4e7\" (UID: \"c276b9ba-801d-47b2-b1c9-2e31f66fb4e7\") " Mar 21 05:40:04 crc kubenswrapper[4580]: I0321 05:40:04.415530 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c276b9ba-801d-47b2-b1c9-2e31f66fb4e7-kube-api-access-cw9fb" (OuterVolumeSpecName: "kube-api-access-cw9fb") pod "c276b9ba-801d-47b2-b1c9-2e31f66fb4e7" (UID: "c276b9ba-801d-47b2-b1c9-2e31f66fb4e7"). InnerVolumeSpecName "kube-api-access-cw9fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:40:04 crc kubenswrapper[4580]: I0321 05:40:04.513017 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw9fb\" (UniqueName: \"kubernetes.io/projected/c276b9ba-801d-47b2-b1c9-2e31f66fb4e7-kube-api-access-cw9fb\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:04 crc kubenswrapper[4580]: I0321 05:40:04.934198 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-k7fsb" Mar 21 05:40:04 crc kubenswrapper[4580]: I0321 05:40:04.934235 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567860-k7fsb" event={"ID":"c276b9ba-801d-47b2-b1c9-2e31f66fb4e7","Type":"ContainerDied","Data":"f3e3f7ad53742108983143b2e1a82b8ba8cf3ac0ad34226db255e57b69a12720"} Mar 21 05:40:04 crc kubenswrapper[4580]: I0321 05:40:04.934291 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e3f7ad53742108983143b2e1a82b8ba8cf3ac0ad34226db255e57b69a12720" Mar 21 05:40:04 crc kubenswrapper[4580]: I0321 05:40:04.934337 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7rtth" podUID="3114ab35-00fa-474c-a746-a7333f2ccb3a" containerName="registry-server" containerID="cri-o://3f4c07d08a86c857dbe6d993921f6278b12a8b0879c92736fa44ae0517bd3f1e" gracePeriod=2 Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.350597 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-2v6sq"] Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.358052 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-2v6sq"] Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.375768 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.529331 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3114ab35-00fa-474c-a746-a7333f2ccb3a-catalog-content\") pod \"3114ab35-00fa-474c-a746-a7333f2ccb3a\" (UID: \"3114ab35-00fa-474c-a746-a7333f2ccb3a\") " Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.529463 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k62hc\" (UniqueName: \"kubernetes.io/projected/3114ab35-00fa-474c-a746-a7333f2ccb3a-kube-api-access-k62hc\") pod \"3114ab35-00fa-474c-a746-a7333f2ccb3a\" (UID: \"3114ab35-00fa-474c-a746-a7333f2ccb3a\") " Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.529494 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3114ab35-00fa-474c-a746-a7333f2ccb3a-utilities\") pod \"3114ab35-00fa-474c-a746-a7333f2ccb3a\" (UID: \"3114ab35-00fa-474c-a746-a7333f2ccb3a\") " Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.530771 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3114ab35-00fa-474c-a746-a7333f2ccb3a-utilities" (OuterVolumeSpecName: "utilities") pod "3114ab35-00fa-474c-a746-a7333f2ccb3a" (UID: "3114ab35-00fa-474c-a746-a7333f2ccb3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.542349 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3114ab35-00fa-474c-a746-a7333f2ccb3a-kube-api-access-k62hc" (OuterVolumeSpecName: "kube-api-access-k62hc") pod "3114ab35-00fa-474c-a746-a7333f2ccb3a" (UID: "3114ab35-00fa-474c-a746-a7333f2ccb3a"). InnerVolumeSpecName "kube-api-access-k62hc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.565757 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3114ab35-00fa-474c-a746-a7333f2ccb3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3114ab35-00fa-474c-a746-a7333f2ccb3a" (UID: "3114ab35-00fa-474c-a746-a7333f2ccb3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.629827 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77fa1559-ac28-4b0d-a925-97d89f945299" path="/var/lib/kubelet/pods/77fa1559-ac28-4b0d-a925-97d89f945299/volumes" Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.631656 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k62hc\" (UniqueName: \"kubernetes.io/projected/3114ab35-00fa-474c-a746-a7333f2ccb3a-kube-api-access-k62hc\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.631688 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3114ab35-00fa-474c-a746-a7333f2ccb3a-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.631702 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3114ab35-00fa-474c-a746-a7333f2ccb3a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.945583 4580 generic.go:334] "Generic (PLEG): container finished" podID="3114ab35-00fa-474c-a746-a7333f2ccb3a" containerID="3f4c07d08a86c857dbe6d993921f6278b12a8b0879c92736fa44ae0517bd3f1e" exitCode=0 Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.945636 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rtth" 
event={"ID":"3114ab35-00fa-474c-a746-a7333f2ccb3a","Type":"ContainerDied","Data":"3f4c07d08a86c857dbe6d993921f6278b12a8b0879c92736fa44ae0517bd3f1e"} Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.945665 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rtth" event={"ID":"3114ab35-00fa-474c-a746-a7333f2ccb3a","Type":"ContainerDied","Data":"6007d56263bd5d94a55350cf8596285e4f823deae53847b6051dbace8c3760a6"} Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.945684 4580 scope.go:117] "RemoveContainer" containerID="3f4c07d08a86c857dbe6d993921f6278b12a8b0879c92736fa44ae0517bd3f1e" Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.945760 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rtth" Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.984753 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rtth"] Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.988812 4580 scope.go:117] "RemoveContainer" containerID="78edb7aea5a231fd4c4f0e4b432ff411b56c51dea1ef19dbd68d06622b2027d0" Mar 21 05:40:05 crc kubenswrapper[4580]: I0321 05:40:05.994636 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rtth"] Mar 21 05:40:06 crc kubenswrapper[4580]: I0321 05:40:06.008130 4580 scope.go:117] "RemoveContainer" containerID="af30a83be315a42aa943bb3243695e7b06a855ab25e1c2c63bd82eb010dcff16" Mar 21 05:40:06 crc kubenswrapper[4580]: I0321 05:40:06.070420 4580 scope.go:117] "RemoveContainer" containerID="3f4c07d08a86c857dbe6d993921f6278b12a8b0879c92736fa44ae0517bd3f1e" Mar 21 05:40:06 crc kubenswrapper[4580]: E0321 05:40:06.071663 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4c07d08a86c857dbe6d993921f6278b12a8b0879c92736fa44ae0517bd3f1e\": container 
with ID starting with 3f4c07d08a86c857dbe6d993921f6278b12a8b0879c92736fa44ae0517bd3f1e not found: ID does not exist" containerID="3f4c07d08a86c857dbe6d993921f6278b12a8b0879c92736fa44ae0517bd3f1e" Mar 21 05:40:06 crc kubenswrapper[4580]: I0321 05:40:06.071701 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4c07d08a86c857dbe6d993921f6278b12a8b0879c92736fa44ae0517bd3f1e"} err="failed to get container status \"3f4c07d08a86c857dbe6d993921f6278b12a8b0879c92736fa44ae0517bd3f1e\": rpc error: code = NotFound desc = could not find container \"3f4c07d08a86c857dbe6d993921f6278b12a8b0879c92736fa44ae0517bd3f1e\": container with ID starting with 3f4c07d08a86c857dbe6d993921f6278b12a8b0879c92736fa44ae0517bd3f1e not found: ID does not exist" Mar 21 05:40:06 crc kubenswrapper[4580]: I0321 05:40:06.071726 4580 scope.go:117] "RemoveContainer" containerID="78edb7aea5a231fd4c4f0e4b432ff411b56c51dea1ef19dbd68d06622b2027d0" Mar 21 05:40:06 crc kubenswrapper[4580]: E0321 05:40:06.072081 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78edb7aea5a231fd4c4f0e4b432ff411b56c51dea1ef19dbd68d06622b2027d0\": container with ID starting with 78edb7aea5a231fd4c4f0e4b432ff411b56c51dea1ef19dbd68d06622b2027d0 not found: ID does not exist" containerID="78edb7aea5a231fd4c4f0e4b432ff411b56c51dea1ef19dbd68d06622b2027d0" Mar 21 05:40:06 crc kubenswrapper[4580]: I0321 05:40:06.072123 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78edb7aea5a231fd4c4f0e4b432ff411b56c51dea1ef19dbd68d06622b2027d0"} err="failed to get container status \"78edb7aea5a231fd4c4f0e4b432ff411b56c51dea1ef19dbd68d06622b2027d0\": rpc error: code = NotFound desc = could not find container \"78edb7aea5a231fd4c4f0e4b432ff411b56c51dea1ef19dbd68d06622b2027d0\": container with ID starting with 78edb7aea5a231fd4c4f0e4b432ff411b56c51dea1ef19dbd68d06622b2027d0 not 
found: ID does not exist" Mar 21 05:40:06 crc kubenswrapper[4580]: I0321 05:40:06.072155 4580 scope.go:117] "RemoveContainer" containerID="af30a83be315a42aa943bb3243695e7b06a855ab25e1c2c63bd82eb010dcff16" Mar 21 05:40:06 crc kubenswrapper[4580]: E0321 05:40:06.072396 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af30a83be315a42aa943bb3243695e7b06a855ab25e1c2c63bd82eb010dcff16\": container with ID starting with af30a83be315a42aa943bb3243695e7b06a855ab25e1c2c63bd82eb010dcff16 not found: ID does not exist" containerID="af30a83be315a42aa943bb3243695e7b06a855ab25e1c2c63bd82eb010dcff16" Mar 21 05:40:06 crc kubenswrapper[4580]: I0321 05:40:06.072434 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af30a83be315a42aa943bb3243695e7b06a855ab25e1c2c63bd82eb010dcff16"} err="failed to get container status \"af30a83be315a42aa943bb3243695e7b06a855ab25e1c2c63bd82eb010dcff16\": rpc error: code = NotFound desc = could not find container \"af30a83be315a42aa943bb3243695e7b06a855ab25e1c2c63bd82eb010dcff16\": container with ID starting with af30a83be315a42aa943bb3243695e7b06a855ab25e1c2c63bd82eb010dcff16 not found: ID does not exist" Mar 21 05:40:07 crc kubenswrapper[4580]: I0321 05:40:07.628593 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3114ab35-00fa-474c-a746-a7333f2ccb3a" path="/var/lib/kubelet/pods/3114ab35-00fa-474c-a746-a7333f2ccb3a/volumes" Mar 21 05:40:14 crc kubenswrapper[4580]: I0321 05:40:14.671863 4580 scope.go:117] "RemoveContainer" containerID="d35ced91df9bd8ebef28543dc4af9533487bbd288ae2357f3209ab10d4803691" Mar 21 05:40:36 crc kubenswrapper[4580]: I0321 05:40:36.272997 4580 generic.go:334] "Generic (PLEG): container finished" podID="53375dd9-0a2b-413f-8fa2-1ebd8d63df42" containerID="86278405e1a31d52776aa2b7591dcc4b472c70052f233fffddca85f3005b4194" exitCode=0 Mar 21 05:40:36 crc kubenswrapper[4580]: I0321 
05:40:36.273483 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" event={"ID":"53375dd9-0a2b-413f-8fa2-1ebd8d63df42","Type":"ContainerDied","Data":"86278405e1a31d52776aa2b7591dcc4b472c70052f233fffddca85f3005b4194"} Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.664541 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.743111 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-inventory\") pod \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.743226 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-1\") pod \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.743350 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-telemetry-combined-ca-bundle\") pod \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.743393 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4fgk\" (UniqueName: \"kubernetes.io/projected/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-kube-api-access-c4fgk\") pod \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " Mar 21 05:40:37 crc 
kubenswrapper[4580]: I0321 05:40:37.743441 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-0\") pod \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.743538 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ssh-key-openstack-edpm-ipam\") pod \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.743614 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-2\") pod \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\" (UID: \"53375dd9-0a2b-413f-8fa2-1ebd8d63df42\") " Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.755475 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "53375dd9-0a2b-413f-8fa2-1ebd8d63df42" (UID: "53375dd9-0a2b-413f-8fa2-1ebd8d63df42"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.763741 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-kube-api-access-c4fgk" (OuterVolumeSpecName: "kube-api-access-c4fgk") pod "53375dd9-0a2b-413f-8fa2-1ebd8d63df42" (UID: "53375dd9-0a2b-413f-8fa2-1ebd8d63df42"). 
InnerVolumeSpecName "kube-api-access-c4fgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.780220 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "53375dd9-0a2b-413f-8fa2-1ebd8d63df42" (UID: "53375dd9-0a2b-413f-8fa2-1ebd8d63df42"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.783130 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "53375dd9-0a2b-413f-8fa2-1ebd8d63df42" (UID: "53375dd9-0a2b-413f-8fa2-1ebd8d63df42"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.785357 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "53375dd9-0a2b-413f-8fa2-1ebd8d63df42" (UID: "53375dd9-0a2b-413f-8fa2-1ebd8d63df42"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.787915 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "53375dd9-0a2b-413f-8fa2-1ebd8d63df42" (UID: "53375dd9-0a2b-413f-8fa2-1ebd8d63df42"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.793691 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-inventory" (OuterVolumeSpecName: "inventory") pod "53375dd9-0a2b-413f-8fa2-1ebd8d63df42" (UID: "53375dd9-0a2b-413f-8fa2-1ebd8d63df42"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.847523 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.847595 4580 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.847616 4580 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.847636 4580 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.847656 4580 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.847670 4580 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4fgk\" (UniqueName: \"kubernetes.io/projected/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-kube-api-access-c4fgk\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:37 crc kubenswrapper[4580]: I0321 05:40:37.847682 4580 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/53375dd9-0a2b-413f-8fa2-1ebd8d63df42-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:38 crc kubenswrapper[4580]: I0321 05:40:38.292676 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" event={"ID":"53375dd9-0a2b-413f-8fa2-1ebd8d63df42","Type":"ContainerDied","Data":"93aee533e3e658cbe92b2a2bb8ce147d40f467e9dfef84be11e7d1fc58b908e7"} Mar 21 05:40:38 crc kubenswrapper[4580]: I0321 05:40:38.292728 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93aee533e3e658cbe92b2a2bb8ce147d40f467e9dfef84be11e7d1fc58b908e7" Mar 21 05:40:38 crc kubenswrapper[4580]: I0321 05:40:38.292769 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bc666" Mar 21 05:41:15 crc kubenswrapper[4580]: I0321 05:41:15.948252 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:41:15 crc kubenswrapper[4580]: I0321 05:41:15.948734 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.510334 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 21 05:41:40 crc kubenswrapper[4580]: E0321 05:41:40.512303 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3114ab35-00fa-474c-a746-a7333f2ccb3a" containerName="extract-content" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.512410 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3114ab35-00fa-474c-a746-a7333f2ccb3a" containerName="extract-content" Mar 21 05:41:40 crc kubenswrapper[4580]: E0321 05:41:40.512513 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53375dd9-0a2b-413f-8fa2-1ebd8d63df42" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.512579 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="53375dd9-0a2b-413f-8fa2-1ebd8d63df42" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 21 05:41:40 crc kubenswrapper[4580]: E0321 05:41:40.512649 4580 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c276b9ba-801d-47b2-b1c9-2e31f66fb4e7" containerName="oc" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.512705 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c276b9ba-801d-47b2-b1c9-2e31f66fb4e7" containerName="oc" Mar 21 05:41:40 crc kubenswrapper[4580]: E0321 05:41:40.513265 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3114ab35-00fa-474c-a746-a7333f2ccb3a" containerName="registry-server" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.513362 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3114ab35-00fa-474c-a746-a7333f2ccb3a" containerName="registry-server" Mar 21 05:41:40 crc kubenswrapper[4580]: E0321 05:41:40.513428 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3114ab35-00fa-474c-a746-a7333f2ccb3a" containerName="extract-utilities" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.513481 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3114ab35-00fa-474c-a746-a7333f2ccb3a" containerName="extract-utilities" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.513738 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="53375dd9-0a2b-413f-8fa2-1ebd8d63df42" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.513835 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="c276b9ba-801d-47b2-b1c9-2e31f66fb4e7" containerName="oc" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.513943 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="3114ab35-00fa-474c-a746-a7333f2ccb3a" containerName="registry-server" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.514683 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.517139 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.517139 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.518242 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.518462 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-khfnw" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.542420 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.633327 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.633408 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c692d589-bfb1-449b-91ff-8517954bc204-config-data\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.633570 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c692d589-bfb1-449b-91ff-8517954bc204-openstack-config\") pod 
\"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.633618 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btms6\" (UniqueName: \"kubernetes.io/projected/c692d589-bfb1-449b-91ff-8517954bc204-kube-api-access-btms6\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.633774 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c692d589-bfb1-449b-91ff-8517954bc204-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.633999 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c692d589-bfb1-449b-91ff-8517954bc204-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.634088 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.634377 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.634528 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.737021 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.737123 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.737205 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c692d589-bfb1-449b-91ff-8517954bc204-config-data\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.737248 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c692d589-bfb1-449b-91ff-8517954bc204-openstack-config\") pod 
\"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.737273 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btms6\" (UniqueName: \"kubernetes.io/projected/c692d589-bfb1-449b-91ff-8517954bc204-kube-api-access-btms6\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.737354 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c692d589-bfb1-449b-91ff-8517954bc204-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.737384 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c692d589-bfb1-449b-91ff-8517954bc204-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.737418 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.737500 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " 
pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.739098 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.739220 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c692d589-bfb1-449b-91ff-8517954bc204-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.739519 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c692d589-bfb1-449b-91ff-8517954bc204-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.739584 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c692d589-bfb1-449b-91ff-8517954bc204-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.739987 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c692d589-bfb1-449b-91ff-8517954bc204-config-data\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc 
kubenswrapper[4580]: I0321 05:41:40.745543 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.750713 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.753612 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.763197 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btms6\" (UniqueName: \"kubernetes.io/projected/c692d589-bfb1-449b-91ff-8517954bc204-kube-api-access-btms6\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.767774 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " pod="openstack/tempest-tests-tempest" Mar 21 05:41:40 crc kubenswrapper[4580]: I0321 05:41:40.833336 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 21 05:41:41 crc kubenswrapper[4580]: I0321 05:41:41.293471 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 21 05:41:41 crc kubenswrapper[4580]: I0321 05:41:41.826720 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c692d589-bfb1-449b-91ff-8517954bc204","Type":"ContainerStarted","Data":"62a20f66967e1805bb8e21546249cf71c6205b0869ebe96e21a378c4c4b1b0c8"} Mar 21 05:41:45 crc kubenswrapper[4580]: I0321 05:41:45.947309 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:41:45 crc kubenswrapper[4580]: I0321 05:41:45.947844 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:42:00 crc kubenswrapper[4580]: I0321 05:42:00.164625 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567862-rwqbg"] Mar 21 05:42:00 crc kubenswrapper[4580]: I0321 05:42:00.166802 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-rwqbg" Mar 21 05:42:00 crc kubenswrapper[4580]: I0321 05:42:00.169985 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:42:00 crc kubenswrapper[4580]: I0321 05:42:00.170209 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:42:00 crc kubenswrapper[4580]: I0321 05:42:00.170445 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:42:00 crc kubenswrapper[4580]: I0321 05:42:00.196116 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567862-rwqbg"] Mar 21 05:42:00 crc kubenswrapper[4580]: I0321 05:42:00.361752 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsf6f\" (UniqueName: \"kubernetes.io/projected/e3e06aaf-09aa-47d0-823b-05c80face2d4-kube-api-access-dsf6f\") pod \"auto-csr-approver-29567862-rwqbg\" (UID: \"e3e06aaf-09aa-47d0-823b-05c80face2d4\") " pod="openshift-infra/auto-csr-approver-29567862-rwqbg" Mar 21 05:42:00 crc kubenswrapper[4580]: I0321 05:42:00.463883 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsf6f\" (UniqueName: \"kubernetes.io/projected/e3e06aaf-09aa-47d0-823b-05c80face2d4-kube-api-access-dsf6f\") pod \"auto-csr-approver-29567862-rwqbg\" (UID: \"e3e06aaf-09aa-47d0-823b-05c80face2d4\") " pod="openshift-infra/auto-csr-approver-29567862-rwqbg" Mar 21 05:42:00 crc kubenswrapper[4580]: I0321 05:42:00.501225 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsf6f\" (UniqueName: \"kubernetes.io/projected/e3e06aaf-09aa-47d0-823b-05c80face2d4-kube-api-access-dsf6f\") pod \"auto-csr-approver-29567862-rwqbg\" (UID: \"e3e06aaf-09aa-47d0-823b-05c80face2d4\") " 
pod="openshift-infra/auto-csr-approver-29567862-rwqbg" Mar 21 05:42:00 crc kubenswrapper[4580]: I0321 05:42:00.795457 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-rwqbg" Mar 21 05:42:13 crc kubenswrapper[4580]: I0321 05:42:13.959192 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="c77c9b9f-3e73-4cef-9e10-39bfef8357b5" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 21 05:42:15 crc kubenswrapper[4580]: I0321 05:42:15.947983 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:42:15 crc kubenswrapper[4580]: I0321 05:42:15.948295 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:42:15 crc kubenswrapper[4580]: I0321 05:42:15.948339 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 05:42:15 crc kubenswrapper[4580]: I0321 05:42:15.949079 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7da25fc56e97732a5d1594544cf7d6bd189dbc61b5ca3f85d33f7ac4406a856c"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:42:15 crc kubenswrapper[4580]: I0321 05:42:15.949153 4580 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://7da25fc56e97732a5d1594544cf7d6bd189dbc61b5ca3f85d33f7ac4406a856c" gracePeriod=600 Mar 21 05:42:16 crc kubenswrapper[4580]: I0321 05:42:16.177100 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="7da25fc56e97732a5d1594544cf7d6bd189dbc61b5ca3f85d33f7ac4406a856c" exitCode=0 Mar 21 05:42:16 crc kubenswrapper[4580]: I0321 05:42:16.177140 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"7da25fc56e97732a5d1594544cf7d6bd189dbc61b5ca3f85d33f7ac4406a856c"} Mar 21 05:42:16 crc kubenswrapper[4580]: I0321 05:42:16.177172 4580 scope.go:117] "RemoveContainer" containerID="a736ef2a998a6bf2fb321cf7e230820e4e64780aea220f9ee7aafc489ef12ce7" Mar 21 05:42:22 crc kubenswrapper[4580]: E0321 05:42:22.517270 4580 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 21 05:42:22 crc kubenswrapper[4580]: E0321 05:42:22.525087 4580 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btms6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(c692d589-bfb1-449b-91ff-8517954bc204): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:42:22 crc kubenswrapper[4580]: E0321 05:42:22.526543 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="c692d589-bfb1-449b-91ff-8517954bc204" Mar 21 05:42:23 crc kubenswrapper[4580]: I0321 05:42:23.007537 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567862-rwqbg"] Mar 21 05:42:23 crc kubenswrapper[4580]: I0321 05:42:23.241602 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" 
event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886"} Mar 21 05:42:23 crc kubenswrapper[4580]: I0321 05:42:23.243564 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567862-rwqbg" event={"ID":"e3e06aaf-09aa-47d0-823b-05c80face2d4","Type":"ContainerStarted","Data":"f7ce2c880efe8f3f9118689757bdfa377dc71e2c117555b836dc7f0d5b7f8dfb"} Mar 21 05:42:23 crc kubenswrapper[4580]: E0321 05:42:23.245150 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="c692d589-bfb1-449b-91ff-8517954bc204" Mar 21 05:42:25 crc kubenswrapper[4580]: I0321 05:42:25.260159 4580 generic.go:334] "Generic (PLEG): container finished" podID="e3e06aaf-09aa-47d0-823b-05c80face2d4" containerID="74a2cd779dbe0f77366a7563a48b4eec978d36acf992caba34164deef464e369" exitCode=0 Mar 21 05:42:25 crc kubenswrapper[4580]: I0321 05:42:25.260284 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567862-rwqbg" event={"ID":"e3e06aaf-09aa-47d0-823b-05c80face2d4","Type":"ContainerDied","Data":"74a2cd779dbe0f77366a7563a48b4eec978d36acf992caba34164deef464e369"} Mar 21 05:42:26 crc kubenswrapper[4580]: I0321 05:42:26.583050 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-rwqbg" Mar 21 05:42:26 crc kubenswrapper[4580]: I0321 05:42:26.671855 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsf6f\" (UniqueName: \"kubernetes.io/projected/e3e06aaf-09aa-47d0-823b-05c80face2d4-kube-api-access-dsf6f\") pod \"e3e06aaf-09aa-47d0-823b-05c80face2d4\" (UID: \"e3e06aaf-09aa-47d0-823b-05c80face2d4\") " Mar 21 05:42:26 crc kubenswrapper[4580]: I0321 05:42:26.680095 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e06aaf-09aa-47d0-823b-05c80face2d4-kube-api-access-dsf6f" (OuterVolumeSpecName: "kube-api-access-dsf6f") pod "e3e06aaf-09aa-47d0-823b-05c80face2d4" (UID: "e3e06aaf-09aa-47d0-823b-05c80face2d4"). InnerVolumeSpecName "kube-api-access-dsf6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:42:26 crc kubenswrapper[4580]: I0321 05:42:26.774202 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsf6f\" (UniqueName: \"kubernetes.io/projected/e3e06aaf-09aa-47d0-823b-05c80face2d4-kube-api-access-dsf6f\") on node \"crc\" DevicePath \"\"" Mar 21 05:42:27 crc kubenswrapper[4580]: I0321 05:42:27.281681 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567862-rwqbg" event={"ID":"e3e06aaf-09aa-47d0-823b-05c80face2d4","Type":"ContainerDied","Data":"f7ce2c880efe8f3f9118689757bdfa377dc71e2c117555b836dc7f0d5b7f8dfb"} Mar 21 05:42:27 crc kubenswrapper[4580]: I0321 05:42:27.281719 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7ce2c880efe8f3f9118689757bdfa377dc71e2c117555b836dc7f0d5b7f8dfb" Mar 21 05:42:27 crc kubenswrapper[4580]: I0321 05:42:27.281763 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-rwqbg" Mar 21 05:42:27 crc kubenswrapper[4580]: I0321 05:42:27.655587 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-6ckk9"] Mar 21 05:42:27 crc kubenswrapper[4580]: I0321 05:42:27.665186 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-6ckk9"] Mar 21 05:42:29 crc kubenswrapper[4580]: I0321 05:42:29.630489 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e851aeb4-208d-4952-8135-b3284af9d30f" path="/var/lib/kubelet/pods/e851aeb4-208d-4952-8135-b3284af9d30f/volumes" Mar 21 05:42:38 crc kubenswrapper[4580]: I0321 05:42:38.032489 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 21 05:42:40 crc kubenswrapper[4580]: I0321 05:42:40.399197 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c692d589-bfb1-449b-91ff-8517954bc204","Type":"ContainerStarted","Data":"c0079af9200fe479c0ce3e39ebc2881e7686e8c7feba857b69304e3faaf00a5e"} Mar 21 05:42:40 crc kubenswrapper[4580]: I0321 05:42:40.431461 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.692383988 podStartE2EDuration="1m1.431441922s" podCreationTimestamp="2026-03-21 05:41:39 +0000 UTC" firstStartedPulling="2026-03-21 05:41:41.290769481 +0000 UTC m=+3006.373353109" lastFinishedPulling="2026-03-21 05:42:38.029827405 +0000 UTC m=+3063.112411043" observedRunningTime="2026-03-21 05:42:40.420635251 +0000 UTC m=+3065.503218889" watchObservedRunningTime="2026-03-21 05:42:40.431441922 +0000 UTC m=+3065.514025560" Mar 21 05:43:14 crc kubenswrapper[4580]: I0321 05:43:14.847182 4580 scope.go:117] "RemoveContainer" containerID="785f4c50746739aaf71b2500d7173d011116f9f7d3c832e82f59cd838b1fd70a" Mar 21 05:44:00 crc 
kubenswrapper[4580]: I0321 05:44:00.149383 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567864-d4qkl"] Mar 21 05:44:00 crc kubenswrapper[4580]: E0321 05:44:00.150422 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e06aaf-09aa-47d0-823b-05c80face2d4" containerName="oc" Mar 21 05:44:00 crc kubenswrapper[4580]: I0321 05:44:00.150439 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e06aaf-09aa-47d0-823b-05c80face2d4" containerName="oc" Mar 21 05:44:00 crc kubenswrapper[4580]: I0321 05:44:00.150685 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e06aaf-09aa-47d0-823b-05c80face2d4" containerName="oc" Mar 21 05:44:00 crc kubenswrapper[4580]: I0321 05:44:00.151432 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567864-d4qkl" Mar 21 05:44:00 crc kubenswrapper[4580]: I0321 05:44:00.154234 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:44:00 crc kubenswrapper[4580]: I0321 05:44:00.154738 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:44:00 crc kubenswrapper[4580]: I0321 05:44:00.160107 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:44:00 crc kubenswrapper[4580]: I0321 05:44:00.171057 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567864-d4qkl"] Mar 21 05:44:00 crc kubenswrapper[4580]: I0321 05:44:00.336591 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv4lc\" (UniqueName: \"kubernetes.io/projected/48ad4a5c-b320-48f3-b044-b2626b51e069-kube-api-access-dv4lc\") pod \"auto-csr-approver-29567864-d4qkl\" (UID: \"48ad4a5c-b320-48f3-b044-b2626b51e069\") " 
pod="openshift-infra/auto-csr-approver-29567864-d4qkl" Mar 21 05:44:00 crc kubenswrapper[4580]: I0321 05:44:00.438361 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv4lc\" (UniqueName: \"kubernetes.io/projected/48ad4a5c-b320-48f3-b044-b2626b51e069-kube-api-access-dv4lc\") pod \"auto-csr-approver-29567864-d4qkl\" (UID: \"48ad4a5c-b320-48f3-b044-b2626b51e069\") " pod="openshift-infra/auto-csr-approver-29567864-d4qkl" Mar 21 05:44:00 crc kubenswrapper[4580]: I0321 05:44:00.460982 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv4lc\" (UniqueName: \"kubernetes.io/projected/48ad4a5c-b320-48f3-b044-b2626b51e069-kube-api-access-dv4lc\") pod \"auto-csr-approver-29567864-d4qkl\" (UID: \"48ad4a5c-b320-48f3-b044-b2626b51e069\") " pod="openshift-infra/auto-csr-approver-29567864-d4qkl" Mar 21 05:44:00 crc kubenswrapper[4580]: I0321 05:44:00.477573 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567864-d4qkl" Mar 21 05:44:00 crc kubenswrapper[4580]: I0321 05:44:00.950727 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567864-d4qkl"] Mar 21 05:44:01 crc kubenswrapper[4580]: I0321 05:44:01.131369 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567864-d4qkl" event={"ID":"48ad4a5c-b320-48f3-b044-b2626b51e069","Type":"ContainerStarted","Data":"dacfcf7b7c086516a71f8ce500bab0e11d1de165d810bd1ffe9b5862a0bdda9e"} Mar 21 05:44:03 crc kubenswrapper[4580]: I0321 05:44:03.150821 4580 generic.go:334] "Generic (PLEG): container finished" podID="48ad4a5c-b320-48f3-b044-b2626b51e069" containerID="e4639720487461648ce28a72053f5fac482eda9612d2425823ae586a6c1dbcd7" exitCode=0 Mar 21 05:44:03 crc kubenswrapper[4580]: I0321 05:44:03.150923 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567864-d4qkl" event={"ID":"48ad4a5c-b320-48f3-b044-b2626b51e069","Type":"ContainerDied","Data":"e4639720487461648ce28a72053f5fac482eda9612d2425823ae586a6c1dbcd7"} Mar 21 05:44:04 crc kubenswrapper[4580]: I0321 05:44:04.547190 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567864-d4qkl" Mar 21 05:44:04 crc kubenswrapper[4580]: I0321 05:44:04.724328 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv4lc\" (UniqueName: \"kubernetes.io/projected/48ad4a5c-b320-48f3-b044-b2626b51e069-kube-api-access-dv4lc\") pod \"48ad4a5c-b320-48f3-b044-b2626b51e069\" (UID: \"48ad4a5c-b320-48f3-b044-b2626b51e069\") " Mar 21 05:44:04 crc kubenswrapper[4580]: I0321 05:44:04.730240 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ad4a5c-b320-48f3-b044-b2626b51e069-kube-api-access-dv4lc" (OuterVolumeSpecName: "kube-api-access-dv4lc") pod "48ad4a5c-b320-48f3-b044-b2626b51e069" (UID: "48ad4a5c-b320-48f3-b044-b2626b51e069"). InnerVolumeSpecName "kube-api-access-dv4lc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:44:04 crc kubenswrapper[4580]: I0321 05:44:04.827130 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv4lc\" (UniqueName: \"kubernetes.io/projected/48ad4a5c-b320-48f3-b044-b2626b51e069-kube-api-access-dv4lc\") on node \"crc\" DevicePath \"\"" Mar 21 05:44:05 crc kubenswrapper[4580]: I0321 05:44:05.169984 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567864-d4qkl" event={"ID":"48ad4a5c-b320-48f3-b044-b2626b51e069","Type":"ContainerDied","Data":"dacfcf7b7c086516a71f8ce500bab0e11d1de165d810bd1ffe9b5862a0bdda9e"} Mar 21 05:44:05 crc kubenswrapper[4580]: I0321 05:44:05.170242 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dacfcf7b7c086516a71f8ce500bab0e11d1de165d810bd1ffe9b5862a0bdda9e" Mar 21 05:44:05 crc kubenswrapper[4580]: I0321 05:44:05.170097 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567864-d4qkl" Mar 21 05:44:05 crc kubenswrapper[4580]: I0321 05:44:05.647811 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567858-g2wsm"] Mar 21 05:44:05 crc kubenswrapper[4580]: I0321 05:44:05.655561 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567858-g2wsm"] Mar 21 05:44:07 crc kubenswrapper[4580]: I0321 05:44:07.632605 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4bd29d6-3fa8-44dd-993a-47feb6717d75" path="/var/lib/kubelet/pods/e4bd29d6-3fa8-44dd-993a-47feb6717d75/volumes" Mar 21 05:44:14 crc kubenswrapper[4580]: I0321 05:44:14.935691 4580 scope.go:117] "RemoveContainer" containerID="dc703c7158f720e2843927c66b82bae55b4756500eef5f915346390e83ebfd00" Mar 21 05:44:45 crc kubenswrapper[4580]: I0321 05:44:45.948258 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:44:45 crc kubenswrapper[4580]: I0321 05:44:45.948807 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.152436 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl"] Mar 21 05:45:00 crc kubenswrapper[4580]: E0321 05:45:00.153354 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ad4a5c-b320-48f3-b044-b2626b51e069" containerName="oc" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.153369 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ad4a5c-b320-48f3-b044-b2626b51e069" containerName="oc" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.153555 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ad4a5c-b320-48f3-b044-b2626b51e069" containerName="oc" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.154242 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.161336 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.161528 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.194602 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl"] Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.250464 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fd4aaee-0552-4192-89cb-b361139fd76e-config-volume\") pod \"collect-profiles-29567865-p99tl\" (UID: \"4fd4aaee-0552-4192-89cb-b361139fd76e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.250539 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fd4aaee-0552-4192-89cb-b361139fd76e-secret-volume\") pod \"collect-profiles-29567865-p99tl\" (UID: \"4fd4aaee-0552-4192-89cb-b361139fd76e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.250569 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vt6f\" (UniqueName: \"kubernetes.io/projected/4fd4aaee-0552-4192-89cb-b361139fd76e-kube-api-access-2vt6f\") pod \"collect-profiles-29567865-p99tl\" (UID: \"4fd4aaee-0552-4192-89cb-b361139fd76e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.352489 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fd4aaee-0552-4192-89cb-b361139fd76e-config-volume\") pod \"collect-profiles-29567865-p99tl\" (UID: \"4fd4aaee-0552-4192-89cb-b361139fd76e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.352551 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fd4aaee-0552-4192-89cb-b361139fd76e-secret-volume\") pod \"collect-profiles-29567865-p99tl\" (UID: \"4fd4aaee-0552-4192-89cb-b361139fd76e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.352614 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vt6f\" (UniqueName: \"kubernetes.io/projected/4fd4aaee-0552-4192-89cb-b361139fd76e-kube-api-access-2vt6f\") pod \"collect-profiles-29567865-p99tl\" (UID: \"4fd4aaee-0552-4192-89cb-b361139fd76e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.353549 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fd4aaee-0552-4192-89cb-b361139fd76e-config-volume\") pod \"collect-profiles-29567865-p99tl\" (UID: \"4fd4aaee-0552-4192-89cb-b361139fd76e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.363874 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4fd4aaee-0552-4192-89cb-b361139fd76e-secret-volume\") pod \"collect-profiles-29567865-p99tl\" (UID: \"4fd4aaee-0552-4192-89cb-b361139fd76e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.379501 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vt6f\" (UniqueName: \"kubernetes.io/projected/4fd4aaee-0552-4192-89cb-b361139fd76e-kube-api-access-2vt6f\") pod \"collect-profiles-29567865-p99tl\" (UID: \"4fd4aaee-0552-4192-89cb-b361139fd76e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" Mar 21 05:45:00 crc kubenswrapper[4580]: I0321 05:45:00.475026 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" Mar 21 05:45:01 crc kubenswrapper[4580]: I0321 05:45:01.201893 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl"] Mar 21 05:45:01 crc kubenswrapper[4580]: I0321 05:45:01.693084 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" event={"ID":"4fd4aaee-0552-4192-89cb-b361139fd76e","Type":"ContainerStarted","Data":"490081d11321c3f7392f4b54104435f348c93251c2ee0398c49b27f75559cc67"} Mar 21 05:45:01 crc kubenswrapper[4580]: I0321 05:45:01.693419 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" event={"ID":"4fd4aaee-0552-4192-89cb-b361139fd76e","Type":"ContainerStarted","Data":"239bffd9bc2d4202b53b08433c234c6dfea81e12a57af9de3793b1c216805487"} Mar 21 05:45:01 crc kubenswrapper[4580]: I0321 05:45:01.715174 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" 
podStartSLOduration=1.71515606 podStartE2EDuration="1.71515606s" podCreationTimestamp="2026-03-21 05:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:45:01.71218396 +0000 UTC m=+3206.794767608" watchObservedRunningTime="2026-03-21 05:45:01.71515606 +0000 UTC m=+3206.797739688" Mar 21 05:45:02 crc kubenswrapper[4580]: I0321 05:45:02.722153 4580 generic.go:334] "Generic (PLEG): container finished" podID="4fd4aaee-0552-4192-89cb-b361139fd76e" containerID="490081d11321c3f7392f4b54104435f348c93251c2ee0398c49b27f75559cc67" exitCode=0 Mar 21 05:45:02 crc kubenswrapper[4580]: I0321 05:45:02.722730 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" event={"ID":"4fd4aaee-0552-4192-89cb-b361139fd76e","Type":"ContainerDied","Data":"490081d11321c3f7392f4b54104435f348c93251c2ee0398c49b27f75559cc67"} Mar 21 05:45:04 crc kubenswrapper[4580]: I0321 05:45:04.200587 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" Mar 21 05:45:04 crc kubenswrapper[4580]: I0321 05:45:04.264435 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vt6f\" (UniqueName: \"kubernetes.io/projected/4fd4aaee-0552-4192-89cb-b361139fd76e-kube-api-access-2vt6f\") pod \"4fd4aaee-0552-4192-89cb-b361139fd76e\" (UID: \"4fd4aaee-0552-4192-89cb-b361139fd76e\") " Mar 21 05:45:04 crc kubenswrapper[4580]: I0321 05:45:04.268182 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fd4aaee-0552-4192-89cb-b361139fd76e-secret-volume\") pod \"4fd4aaee-0552-4192-89cb-b361139fd76e\" (UID: \"4fd4aaee-0552-4192-89cb-b361139fd76e\") " Mar 21 05:45:04 crc kubenswrapper[4580]: I0321 05:45:04.268703 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fd4aaee-0552-4192-89cb-b361139fd76e-config-volume\") pod \"4fd4aaee-0552-4192-89cb-b361139fd76e\" (UID: \"4fd4aaee-0552-4192-89cb-b361139fd76e\") " Mar 21 05:45:04 crc kubenswrapper[4580]: I0321 05:45:04.269446 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fd4aaee-0552-4192-89cb-b361139fd76e-config-volume" (OuterVolumeSpecName: "config-volume") pod "4fd4aaee-0552-4192-89cb-b361139fd76e" (UID: "4fd4aaee-0552-4192-89cb-b361139fd76e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:45:04 crc kubenswrapper[4580]: I0321 05:45:04.270964 4580 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fd4aaee-0552-4192-89cb-b361139fd76e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:04 crc kubenswrapper[4580]: I0321 05:45:04.277954 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd4aaee-0552-4192-89cb-b361139fd76e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4fd4aaee-0552-4192-89cb-b361139fd76e" (UID: "4fd4aaee-0552-4192-89cb-b361139fd76e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:45:04 crc kubenswrapper[4580]: I0321 05:45:04.278801 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd4aaee-0552-4192-89cb-b361139fd76e-kube-api-access-2vt6f" (OuterVolumeSpecName: "kube-api-access-2vt6f") pod "4fd4aaee-0552-4192-89cb-b361139fd76e" (UID: "4fd4aaee-0552-4192-89cb-b361139fd76e"). InnerVolumeSpecName "kube-api-access-2vt6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:45:04 crc kubenswrapper[4580]: I0321 05:45:04.373516 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vt6f\" (UniqueName: \"kubernetes.io/projected/4fd4aaee-0552-4192-89cb-b361139fd76e-kube-api-access-2vt6f\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:04 crc kubenswrapper[4580]: I0321 05:45:04.374040 4580 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fd4aaee-0552-4192-89cb-b361139fd76e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:04 crc kubenswrapper[4580]: I0321 05:45:04.745680 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" event={"ID":"4fd4aaee-0552-4192-89cb-b361139fd76e","Type":"ContainerDied","Data":"239bffd9bc2d4202b53b08433c234c6dfea81e12a57af9de3793b1c216805487"} Mar 21 05:45:04 crc kubenswrapper[4580]: I0321 05:45:04.745718 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="239bffd9bc2d4202b53b08433c234c6dfea81e12a57af9de3793b1c216805487" Mar 21 05:45:04 crc kubenswrapper[4580]: I0321 05:45:04.745740 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-p99tl" Mar 21 05:45:05 crc kubenswrapper[4580]: I0321 05:45:05.291056 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98"] Mar 21 05:45:05 crc kubenswrapper[4580]: I0321 05:45:05.299101 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-t8d98"] Mar 21 05:45:05 crc kubenswrapper[4580]: I0321 05:45:05.632180 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89ff7df0-d49d-4787-bdaf-145ef7647123" path="/var/lib/kubelet/pods/89ff7df0-d49d-4787-bdaf-145ef7647123/volumes" Mar 21 05:45:15 crc kubenswrapper[4580]: I0321 05:45:15.024620 4580 scope.go:117] "RemoveContainer" containerID="05ae03480dc5f9c4472df88544f2498e8c3906b54ee45ff1be0d994d971c4017" Mar 21 05:45:15 crc kubenswrapper[4580]: I0321 05:45:15.947928 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:45:15 crc kubenswrapper[4580]: I0321 05:45:15.947986 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:45:40 crc kubenswrapper[4580]: I0321 05:45:40.032833 4580 generic.go:334] "Generic (PLEG): container finished" podID="c692d589-bfb1-449b-91ff-8517954bc204" containerID="c0079af9200fe479c0ce3e39ebc2881e7686e8c7feba857b69304e3faaf00a5e" exitCode=0 Mar 21 05:45:40 crc kubenswrapper[4580]: I0321 05:45:40.032868 4580 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c692d589-bfb1-449b-91ff-8517954bc204","Type":"ContainerDied","Data":"c0079af9200fe479c0ce3e39ebc2881e7686e8c7feba857b69304e3faaf00a5e"} Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.461263 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.603387 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c692d589-bfb1-449b-91ff-8517954bc204-test-operator-ephemeral-temporary\") pod \"c692d589-bfb1-449b-91ff-8517954bc204\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.603443 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c692d589-bfb1-449b-91ff-8517954bc204-test-operator-ephemeral-workdir\") pod \"c692d589-bfb1-449b-91ff-8517954bc204\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.603518 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c692d589-bfb1-449b-91ff-8517954bc204-config-data\") pod \"c692d589-bfb1-449b-91ff-8517954bc204\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.603550 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btms6\" (UniqueName: \"kubernetes.io/projected/c692d589-bfb1-449b-91ff-8517954bc204-kube-api-access-btms6\") pod \"c692d589-bfb1-449b-91ff-8517954bc204\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.603614 4580 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-openstack-config-secret\") pod \"c692d589-bfb1-449b-91ff-8517954bc204\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.603641 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c692d589-bfb1-449b-91ff-8517954bc204-openstack-config\") pod \"c692d589-bfb1-449b-91ff-8517954bc204\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.603670 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-ssh-key\") pod \"c692d589-bfb1-449b-91ff-8517954bc204\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.603729 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c692d589-bfb1-449b-91ff-8517954bc204\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.603811 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-ca-certs\") pod \"c692d589-bfb1-449b-91ff-8517954bc204\" (UID: \"c692d589-bfb1-449b-91ff-8517954bc204\") " Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.604176 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c692d589-bfb1-449b-91ff-8517954bc204-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod 
"c692d589-bfb1-449b-91ff-8517954bc204" (UID: "c692d589-bfb1-449b-91ff-8517954bc204"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.604294 4580 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c692d589-bfb1-449b-91ff-8517954bc204-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.604585 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c692d589-bfb1-449b-91ff-8517954bc204-config-data" (OuterVolumeSpecName: "config-data") pod "c692d589-bfb1-449b-91ff-8517954bc204" (UID: "c692d589-bfb1-449b-91ff-8517954bc204"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.610198 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "c692d589-bfb1-449b-91ff-8517954bc204" (UID: "c692d589-bfb1-449b-91ff-8517954bc204"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.610588 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c692d589-bfb1-449b-91ff-8517954bc204-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "c692d589-bfb1-449b-91ff-8517954bc204" (UID: "c692d589-bfb1-449b-91ff-8517954bc204"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.622536 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c692d589-bfb1-449b-91ff-8517954bc204-kube-api-access-btms6" (OuterVolumeSpecName: "kube-api-access-btms6") pod "c692d589-bfb1-449b-91ff-8517954bc204" (UID: "c692d589-bfb1-449b-91ff-8517954bc204"). InnerVolumeSpecName "kube-api-access-btms6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.633175 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c692d589-bfb1-449b-91ff-8517954bc204" (UID: "c692d589-bfb1-449b-91ff-8517954bc204"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.638220 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c692d589-bfb1-449b-91ff-8517954bc204" (UID: "c692d589-bfb1-449b-91ff-8517954bc204"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.640971 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "c692d589-bfb1-449b-91ff-8517954bc204" (UID: "c692d589-bfb1-449b-91ff-8517954bc204"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.655761 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c692d589-bfb1-449b-91ff-8517954bc204-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c692d589-bfb1-449b-91ff-8517954bc204" (UID: "c692d589-bfb1-449b-91ff-8517954bc204"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.706177 4580 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.706213 4580 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.706226 4580 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c692d589-bfb1-449b-91ff-8517954bc204-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.706237 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c692d589-bfb1-449b-91ff-8517954bc204-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.706246 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btms6\" (UniqueName: \"kubernetes.io/projected/c692d589-bfb1-449b-91ff-8517954bc204-kube-api-access-btms6\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.706255 4580 reconciler_common.go:293] "Volume detached for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.706266 4580 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c692d589-bfb1-449b-91ff-8517954bc204-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.706274 4580 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c692d589-bfb1-449b-91ff-8517954bc204-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.724961 4580 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 21 05:45:41 crc kubenswrapper[4580]: I0321 05:45:41.808199 4580 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:42 crc kubenswrapper[4580]: I0321 05:45:42.051984 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c692d589-bfb1-449b-91ff-8517954bc204","Type":"ContainerDied","Data":"62a20f66967e1805bb8e21546249cf71c6205b0869ebe96e21a378c4c4b1b0c8"} Mar 21 05:45:42 crc kubenswrapper[4580]: I0321 05:45:42.052357 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62a20f66967e1805bb8e21546249cf71c6205b0869ebe96e21a378c4c4b1b0c8" Mar 21 05:45:42 crc kubenswrapper[4580]: I0321 05:45:42.052030 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 21 05:45:45 crc kubenswrapper[4580]: I0321 05:45:45.947759 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:45:45 crc kubenswrapper[4580]: I0321 05:45:45.948331 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:45:45 crc kubenswrapper[4580]: I0321 05:45:45.948373 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 05:45:45 crc kubenswrapper[4580]: I0321 05:45:45.949298 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:45:45 crc kubenswrapper[4580]: I0321 05:45:45.949360 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" gracePeriod=600 Mar 21 05:45:46 crc kubenswrapper[4580]: E0321 05:45:46.073910 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:45:46 crc kubenswrapper[4580]: I0321 05:45:46.086821 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" exitCode=0 Mar 21 05:45:46 crc kubenswrapper[4580]: I0321 05:45:46.086887 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886"} Mar 21 05:45:46 crc kubenswrapper[4580]: I0321 05:45:46.086946 4580 scope.go:117] "RemoveContainer" containerID="7da25fc56e97732a5d1594544cf7d6bd189dbc61b5ca3f85d33f7ac4406a856c" Mar 21 05:45:46 crc kubenswrapper[4580]: I0321 05:45:46.088100 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:45:46 crc kubenswrapper[4580]: E0321 05:45:46.088516 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.313495 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 21 05:45:55 crc 
kubenswrapper[4580]: E0321 05:45:55.314564 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c692d589-bfb1-449b-91ff-8517954bc204" containerName="tempest-tests-tempest-tests-runner" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.314579 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c692d589-bfb1-449b-91ff-8517954bc204" containerName="tempest-tests-tempest-tests-runner" Mar 21 05:45:55 crc kubenswrapper[4580]: E0321 05:45:55.314611 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd4aaee-0552-4192-89cb-b361139fd76e" containerName="collect-profiles" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.314620 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd4aaee-0552-4192-89cb-b361139fd76e" containerName="collect-profiles" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.314865 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="c692d589-bfb1-449b-91ff-8517954bc204" containerName="tempest-tests-tempest-tests-runner" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.314885 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd4aaee-0552-4192-89cb-b361139fd76e" containerName="collect-profiles" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.315577 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.319331 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-khfnw" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.327967 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.373939 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"25fd8fca-3d1d-4c2e-af01-c5ca004814fd\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.373991 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqhw4\" (UniqueName: \"kubernetes.io/projected/25fd8fca-3d1d-4c2e-af01-c5ca004814fd-kube-api-access-cqhw4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"25fd8fca-3d1d-4c2e-af01-c5ca004814fd\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.475882 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"25fd8fca-3d1d-4c2e-af01-c5ca004814fd\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.475965 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqhw4\" (UniqueName: 
\"kubernetes.io/projected/25fd8fca-3d1d-4c2e-af01-c5ca004814fd-kube-api-access-cqhw4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"25fd8fca-3d1d-4c2e-af01-c5ca004814fd\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.476855 4580 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"25fd8fca-3d1d-4c2e-af01-c5ca004814fd\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.493377 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqhw4\" (UniqueName: \"kubernetes.io/projected/25fd8fca-3d1d-4c2e-af01-c5ca004814fd-kube-api-access-cqhw4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"25fd8fca-3d1d-4c2e-af01-c5ca004814fd\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.514181 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"25fd8fca-3d1d-4c2e-af01-c5ca004814fd\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:45:55 crc kubenswrapper[4580]: I0321 05:45:55.637664 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:45:56 crc kubenswrapper[4580]: I0321 05:45:56.078405 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 21 05:45:56 crc kubenswrapper[4580]: I0321 05:45:56.079364 4580 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:45:56 crc kubenswrapper[4580]: I0321 05:45:56.189727 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"25fd8fca-3d1d-4c2e-af01-c5ca004814fd","Type":"ContainerStarted","Data":"deea4edb62b7345e6a1164e01c529e255bbb011b48ee937a3314515e4986eca3"} Mar 21 05:45:57 crc kubenswrapper[4580]: I0321 05:45:57.618493 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:45:57 crc kubenswrapper[4580]: E0321 05:45:57.619219 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:45:58 crc kubenswrapper[4580]: I0321 05:45:58.216934 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"25fd8fca-3d1d-4c2e-af01-c5ca004814fd","Type":"ContainerStarted","Data":"08732ae75ec81f3a95da6acd92aca58f9f317f639f3727ba500dc2e2c445a4d6"} Mar 21 05:45:58 crc kubenswrapper[4580]: I0321 05:45:58.246977 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.191096864 podStartE2EDuration="3.246955321s" podCreationTimestamp="2026-03-21 05:45:55 +0000 UTC" firstStartedPulling="2026-03-21 05:45:56.079067906 +0000 UTC m=+3261.161651544" lastFinishedPulling="2026-03-21 05:45:57.134926373 +0000 UTC m=+3262.217510001" observedRunningTime="2026-03-21 05:45:58.238851723 +0000 UTC m=+3263.321435411" watchObservedRunningTime="2026-03-21 05:45:58.246955321 +0000 UTC m=+3263.329538969" Mar 21 05:46:00 crc kubenswrapper[4580]: I0321 05:46:00.143381 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567866-t9zwb"] Mar 21 05:46:00 crc kubenswrapper[4580]: I0321 05:46:00.144809 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567866-t9zwb" Mar 21 05:46:00 crc kubenswrapper[4580]: I0321 05:46:00.147382 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:46:00 crc kubenswrapper[4580]: I0321 05:46:00.147704 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:46:00 crc kubenswrapper[4580]: I0321 05:46:00.147726 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:46:00 crc kubenswrapper[4580]: I0321 05:46:00.158479 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567866-t9zwb"] Mar 21 05:46:00 crc kubenswrapper[4580]: I0321 05:46:00.264255 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l266\" (UniqueName: \"kubernetes.io/projected/01138593-99cf-4cf0-b9bc-c9c8fb742e20-kube-api-access-4l266\") pod \"auto-csr-approver-29567866-t9zwb\" (UID: \"01138593-99cf-4cf0-b9bc-c9c8fb742e20\") " 
pod="openshift-infra/auto-csr-approver-29567866-t9zwb" Mar 21 05:46:00 crc kubenswrapper[4580]: I0321 05:46:00.365477 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l266\" (UniqueName: \"kubernetes.io/projected/01138593-99cf-4cf0-b9bc-c9c8fb742e20-kube-api-access-4l266\") pod \"auto-csr-approver-29567866-t9zwb\" (UID: \"01138593-99cf-4cf0-b9bc-c9c8fb742e20\") " pod="openshift-infra/auto-csr-approver-29567866-t9zwb" Mar 21 05:46:00 crc kubenswrapper[4580]: I0321 05:46:00.390725 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l266\" (UniqueName: \"kubernetes.io/projected/01138593-99cf-4cf0-b9bc-c9c8fb742e20-kube-api-access-4l266\") pod \"auto-csr-approver-29567866-t9zwb\" (UID: \"01138593-99cf-4cf0-b9bc-c9c8fb742e20\") " pod="openshift-infra/auto-csr-approver-29567866-t9zwb" Mar 21 05:46:00 crc kubenswrapper[4580]: I0321 05:46:00.475913 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567866-t9zwb" Mar 21 05:46:00 crc kubenswrapper[4580]: I0321 05:46:00.930827 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567866-t9zwb"] Mar 21 05:46:00 crc kubenswrapper[4580]: W0321 05:46:00.933734 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01138593_99cf_4cf0_b9bc_c9c8fb742e20.slice/crio-32d928a8017b1648555282fd7f648b6faf5e417da09e3291059fbf99d1bf6dda WatchSource:0}: Error finding container 32d928a8017b1648555282fd7f648b6faf5e417da09e3291059fbf99d1bf6dda: Status 404 returned error can't find the container with id 32d928a8017b1648555282fd7f648b6faf5e417da09e3291059fbf99d1bf6dda Mar 21 05:46:01 crc kubenswrapper[4580]: I0321 05:46:01.241610 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567866-t9zwb" 
event={"ID":"01138593-99cf-4cf0-b9bc-c9c8fb742e20","Type":"ContainerStarted","Data":"32d928a8017b1648555282fd7f648b6faf5e417da09e3291059fbf99d1bf6dda"} Mar 21 05:46:02 crc kubenswrapper[4580]: I0321 05:46:02.251989 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567866-t9zwb" event={"ID":"01138593-99cf-4cf0-b9bc-c9c8fb742e20","Type":"ContainerStarted","Data":"a187d6d091ab7be4b5941e90c19f8202c786c4c7a7a17e9cc93e6799c52ea5f7"} Mar 21 05:46:02 crc kubenswrapper[4580]: I0321 05:46:02.266577 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567866-t9zwb" podStartSLOduration=1.449408311 podStartE2EDuration="2.266555812s" podCreationTimestamp="2026-03-21 05:46:00 +0000 UTC" firstStartedPulling="2026-03-21 05:46:00.936321992 +0000 UTC m=+3266.018905620" lastFinishedPulling="2026-03-21 05:46:01.753469483 +0000 UTC m=+3266.836053121" observedRunningTime="2026-03-21 05:46:02.263375566 +0000 UTC m=+3267.345959194" watchObservedRunningTime="2026-03-21 05:46:02.266555812 +0000 UTC m=+3267.349139450" Mar 21 05:46:03 crc kubenswrapper[4580]: I0321 05:46:03.263016 4580 generic.go:334] "Generic (PLEG): container finished" podID="01138593-99cf-4cf0-b9bc-c9c8fb742e20" containerID="a187d6d091ab7be4b5941e90c19f8202c786c4c7a7a17e9cc93e6799c52ea5f7" exitCode=0 Mar 21 05:46:03 crc kubenswrapper[4580]: I0321 05:46:03.263081 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567866-t9zwb" event={"ID":"01138593-99cf-4cf0-b9bc-c9c8fb742e20","Type":"ContainerDied","Data":"a187d6d091ab7be4b5941e90c19f8202c786c4c7a7a17e9cc93e6799c52ea5f7"} Mar 21 05:46:04 crc kubenswrapper[4580]: I0321 05:46:04.593112 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567866-t9zwb" Mar 21 05:46:04 crc kubenswrapper[4580]: I0321 05:46:04.758776 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l266\" (UniqueName: \"kubernetes.io/projected/01138593-99cf-4cf0-b9bc-c9c8fb742e20-kube-api-access-4l266\") pod \"01138593-99cf-4cf0-b9bc-c9c8fb742e20\" (UID: \"01138593-99cf-4cf0-b9bc-c9c8fb742e20\") " Mar 21 05:46:04 crc kubenswrapper[4580]: I0321 05:46:04.771253 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01138593-99cf-4cf0-b9bc-c9c8fb742e20-kube-api-access-4l266" (OuterVolumeSpecName: "kube-api-access-4l266") pod "01138593-99cf-4cf0-b9bc-c9c8fb742e20" (UID: "01138593-99cf-4cf0-b9bc-c9c8fb742e20"). InnerVolumeSpecName "kube-api-access-4l266". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:46:04 crc kubenswrapper[4580]: I0321 05:46:04.862849 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l266\" (UniqueName: \"kubernetes.io/projected/01138593-99cf-4cf0-b9bc-c9c8fb742e20-kube-api-access-4l266\") on node \"crc\" DevicePath \"\"" Mar 21 05:46:05 crc kubenswrapper[4580]: I0321 05:46:05.286119 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567866-t9zwb" event={"ID":"01138593-99cf-4cf0-b9bc-c9c8fb742e20","Type":"ContainerDied","Data":"32d928a8017b1648555282fd7f648b6faf5e417da09e3291059fbf99d1bf6dda"} Mar 21 05:46:05 crc kubenswrapper[4580]: I0321 05:46:05.286497 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32d928a8017b1648555282fd7f648b6faf5e417da09e3291059fbf99d1bf6dda" Mar 21 05:46:05 crc kubenswrapper[4580]: I0321 05:46:05.286582 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567866-t9zwb" Mar 21 05:46:05 crc kubenswrapper[4580]: I0321 05:46:05.348629 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567860-k7fsb"] Mar 21 05:46:05 crc kubenswrapper[4580]: I0321 05:46:05.357275 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567860-k7fsb"] Mar 21 05:46:05 crc kubenswrapper[4580]: I0321 05:46:05.628206 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c276b9ba-801d-47b2-b1c9-2e31f66fb4e7" path="/var/lib/kubelet/pods/c276b9ba-801d-47b2-b1c9-2e31f66fb4e7/volumes" Mar 21 05:46:08 crc kubenswrapper[4580]: I0321 05:46:08.618129 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:46:08 crc kubenswrapper[4580]: E0321 05:46:08.618599 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:46:15 crc kubenswrapper[4580]: I0321 05:46:15.095402 4580 scope.go:117] "RemoveContainer" containerID="c1e460ca0be6a9730b568009b7fdf335e61c92ec23e98af46ec287e3a242163f" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.143158 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bpk7s/must-gather-r67c9"] Mar 21 05:46:19 crc kubenswrapper[4580]: E0321 05:46:19.144097 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01138593-99cf-4cf0-b9bc-c9c8fb742e20" containerName="oc" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.144116 4580 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="01138593-99cf-4cf0-b9bc-c9c8fb742e20" containerName="oc" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.144326 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="01138593-99cf-4cf0-b9bc-c9c8fb742e20" containerName="oc" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.145447 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bpk7s/must-gather-r67c9" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.154168 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bpk7s"/"default-dockercfg-hq9v8" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.154410 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bpk7s"/"openshift-service-ca.crt" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.158640 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c79ccdc7-5b73-4541-8dd6-1d11172e66df-must-gather-output\") pod \"must-gather-r67c9\" (UID: \"c79ccdc7-5b73-4541-8dd6-1d11172e66df\") " pod="openshift-must-gather-bpk7s/must-gather-r67c9" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.158898 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68cff\" (UniqueName: \"kubernetes.io/projected/c79ccdc7-5b73-4541-8dd6-1d11172e66df-kube-api-access-68cff\") pod \"must-gather-r67c9\" (UID: \"c79ccdc7-5b73-4541-8dd6-1d11172e66df\") " pod="openshift-must-gather-bpk7s/must-gather-r67c9" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.169124 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bpk7s"/"kube-root-ca.crt" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.185123 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-must-gather-bpk7s/must-gather-r67c9"] Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.261300 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c79ccdc7-5b73-4541-8dd6-1d11172e66df-must-gather-output\") pod \"must-gather-r67c9\" (UID: \"c79ccdc7-5b73-4541-8dd6-1d11172e66df\") " pod="openshift-must-gather-bpk7s/must-gather-r67c9" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.261396 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68cff\" (UniqueName: \"kubernetes.io/projected/c79ccdc7-5b73-4541-8dd6-1d11172e66df-kube-api-access-68cff\") pod \"must-gather-r67c9\" (UID: \"c79ccdc7-5b73-4541-8dd6-1d11172e66df\") " pod="openshift-must-gather-bpk7s/must-gather-r67c9" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.262462 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c79ccdc7-5b73-4541-8dd6-1d11172e66df-must-gather-output\") pod \"must-gather-r67c9\" (UID: \"c79ccdc7-5b73-4541-8dd6-1d11172e66df\") " pod="openshift-must-gather-bpk7s/must-gather-r67c9" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.296286 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68cff\" (UniqueName: \"kubernetes.io/projected/c79ccdc7-5b73-4541-8dd6-1d11172e66df-kube-api-access-68cff\") pod \"must-gather-r67c9\" (UID: \"c79ccdc7-5b73-4541-8dd6-1d11172e66df\") " pod="openshift-must-gather-bpk7s/must-gather-r67c9" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.466145 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bpk7s/must-gather-r67c9" Mar 21 05:46:19 crc kubenswrapper[4580]: I0321 05:46:19.935259 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bpk7s/must-gather-r67c9"] Mar 21 05:46:20 crc kubenswrapper[4580]: I0321 05:46:20.518606 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bpk7s/must-gather-r67c9" event={"ID":"c79ccdc7-5b73-4541-8dd6-1d11172e66df","Type":"ContainerStarted","Data":"c8092de872294cda3f99996eab9012dbed7feb8efb8d3589e9029eccd8464df2"} Mar 21 05:46:20 crc kubenswrapper[4580]: I0321 05:46:20.617403 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:46:20 crc kubenswrapper[4580]: E0321 05:46:20.617973 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:46:24 crc kubenswrapper[4580]: I0321 05:46:24.566189 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bpk7s/must-gather-r67c9" event={"ID":"c79ccdc7-5b73-4541-8dd6-1d11172e66df","Type":"ContainerStarted","Data":"ccddcdbb108133a7b96ae03a5ea07d3849496a89c2321cc44e67f855135b6ebf"} Mar 21 05:46:24 crc kubenswrapper[4580]: I0321 05:46:24.566596 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bpk7s/must-gather-r67c9" event={"ID":"c79ccdc7-5b73-4541-8dd6-1d11172e66df","Type":"ContainerStarted","Data":"1ce1e2bca08ecc44ca9d936dae6d0014fe0f2b346d34eaa186628d26c31616a7"} Mar 21 05:46:24 crc kubenswrapper[4580]: I0321 05:46:24.582412 4580 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-bpk7s/must-gather-r67c9" podStartSLOduration=1.4243897300000001 podStartE2EDuration="5.582396816s" podCreationTimestamp="2026-03-21 05:46:19 +0000 UTC" firstStartedPulling="2026-03-21 05:46:19.94398166 +0000 UTC m=+3285.026565308" lastFinishedPulling="2026-03-21 05:46:24.101988766 +0000 UTC m=+3289.184572394" observedRunningTime="2026-03-21 05:46:24.578531642 +0000 UTC m=+3289.661115280" watchObservedRunningTime="2026-03-21 05:46:24.582396816 +0000 UTC m=+3289.664980444" Mar 21 05:46:28 crc kubenswrapper[4580]: I0321 05:46:28.152590 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bpk7s/crc-debug-z5kdb"] Mar 21 05:46:28 crc kubenswrapper[4580]: I0321 05:46:28.154560 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" Mar 21 05:46:28 crc kubenswrapper[4580]: I0321 05:46:28.241597 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqmq5\" (UniqueName: \"kubernetes.io/projected/88274b5f-93e4-4810-a012-a547bd36c04d-kube-api-access-sqmq5\") pod \"crc-debug-z5kdb\" (UID: \"88274b5f-93e4-4810-a012-a547bd36c04d\") " pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" Mar 21 05:46:28 crc kubenswrapper[4580]: I0321 05:46:28.242018 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88274b5f-93e4-4810-a012-a547bd36c04d-host\") pod \"crc-debug-z5kdb\" (UID: \"88274b5f-93e4-4810-a012-a547bd36c04d\") " pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" Mar 21 05:46:28 crc kubenswrapper[4580]: I0321 05:46:28.344397 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqmq5\" (UniqueName: \"kubernetes.io/projected/88274b5f-93e4-4810-a012-a547bd36c04d-kube-api-access-sqmq5\") pod \"crc-debug-z5kdb\" (UID: 
\"88274b5f-93e4-4810-a012-a547bd36c04d\") " pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" Mar 21 05:46:28 crc kubenswrapper[4580]: I0321 05:46:28.344728 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88274b5f-93e4-4810-a012-a547bd36c04d-host\") pod \"crc-debug-z5kdb\" (UID: \"88274b5f-93e4-4810-a012-a547bd36c04d\") " pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" Mar 21 05:46:28 crc kubenswrapper[4580]: I0321 05:46:28.344857 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88274b5f-93e4-4810-a012-a547bd36c04d-host\") pod \"crc-debug-z5kdb\" (UID: \"88274b5f-93e4-4810-a012-a547bd36c04d\") " pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" Mar 21 05:46:28 crc kubenswrapper[4580]: I0321 05:46:28.369928 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqmq5\" (UniqueName: \"kubernetes.io/projected/88274b5f-93e4-4810-a012-a547bd36c04d-kube-api-access-sqmq5\") pod \"crc-debug-z5kdb\" (UID: \"88274b5f-93e4-4810-a012-a547bd36c04d\") " pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" Mar 21 05:46:28 crc kubenswrapper[4580]: I0321 05:46:28.512296 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" Mar 21 05:46:28 crc kubenswrapper[4580]: I0321 05:46:28.600126 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" event={"ID":"88274b5f-93e4-4810-a012-a547bd36c04d","Type":"ContainerStarted","Data":"18c02aa95619fec08f50fa098be95b7572a37913199d52358a5bb71ff00d9056"} Mar 21 05:46:34 crc kubenswrapper[4580]: I0321 05:46:34.618312 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:46:34 crc kubenswrapper[4580]: E0321 05:46:34.619215 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:46:40 crc kubenswrapper[4580]: I0321 05:46:40.722054 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" event={"ID":"88274b5f-93e4-4810-a012-a547bd36c04d","Type":"ContainerStarted","Data":"83ee520bd99fefe3b954fdfcbd2c1bfe32a8a7fdfa6540abf93c5a296a8f7bc5"} Mar 21 05:46:40 crc kubenswrapper[4580]: I0321 05:46:40.743322 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" podStartSLOduration=1.689261138 podStartE2EDuration="12.743304799s" podCreationTimestamp="2026-03-21 05:46:28 +0000 UTC" firstStartedPulling="2026-03-21 05:46:28.550622514 +0000 UTC m=+3293.633206142" lastFinishedPulling="2026-03-21 05:46:39.604666175 +0000 UTC m=+3304.687249803" observedRunningTime="2026-03-21 05:46:40.742109346 +0000 UTC m=+3305.824692974" watchObservedRunningTime="2026-03-21 05:46:40.743304799 +0000 UTC 
m=+3305.825888427" Mar 21 05:46:45 crc kubenswrapper[4580]: I0321 05:46:45.623490 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:46:45 crc kubenswrapper[4580]: E0321 05:46:45.624265 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:46:57 crc kubenswrapper[4580]: I0321 05:46:57.618614 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:46:57 crc kubenswrapper[4580]: E0321 05:46:57.619355 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:47:09 crc kubenswrapper[4580]: I0321 05:47:09.620549 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:47:09 crc kubenswrapper[4580]: E0321 05:47:09.621305 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" 
podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:47:19 crc kubenswrapper[4580]: I0321 05:47:19.058289 4580 generic.go:334] "Generic (PLEG): container finished" podID="88274b5f-93e4-4810-a012-a547bd36c04d" containerID="83ee520bd99fefe3b954fdfcbd2c1bfe32a8a7fdfa6540abf93c5a296a8f7bc5" exitCode=0 Mar 21 05:47:19 crc kubenswrapper[4580]: I0321 05:47:19.058536 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" event={"ID":"88274b5f-93e4-4810-a012-a547bd36c04d","Type":"ContainerDied","Data":"83ee520bd99fefe3b954fdfcbd2c1bfe32a8a7fdfa6540abf93c5a296a8f7bc5"} Mar 21 05:47:20 crc kubenswrapper[4580]: I0321 05:47:20.183847 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" Mar 21 05:47:20 crc kubenswrapper[4580]: I0321 05:47:20.214014 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bpk7s/crc-debug-z5kdb"] Mar 21 05:47:20 crc kubenswrapper[4580]: I0321 05:47:20.220972 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bpk7s/crc-debug-z5kdb"] Mar 21 05:47:20 crc kubenswrapper[4580]: I0321 05:47:20.348085 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88274b5f-93e4-4810-a012-a547bd36c04d-host\") pod \"88274b5f-93e4-4810-a012-a547bd36c04d\" (UID: \"88274b5f-93e4-4810-a012-a547bd36c04d\") " Mar 21 05:47:20 crc kubenswrapper[4580]: I0321 05:47:20.348142 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqmq5\" (UniqueName: \"kubernetes.io/projected/88274b5f-93e4-4810-a012-a547bd36c04d-kube-api-access-sqmq5\") pod \"88274b5f-93e4-4810-a012-a547bd36c04d\" (UID: \"88274b5f-93e4-4810-a012-a547bd36c04d\") " Mar 21 05:47:20 crc kubenswrapper[4580]: I0321 05:47:20.348749 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/88274b5f-93e4-4810-a012-a547bd36c04d-host" (OuterVolumeSpecName: "host") pod "88274b5f-93e4-4810-a012-a547bd36c04d" (UID: "88274b5f-93e4-4810-a012-a547bd36c04d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:47:20 crc kubenswrapper[4580]: I0321 05:47:20.358485 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88274b5f-93e4-4810-a012-a547bd36c04d-kube-api-access-sqmq5" (OuterVolumeSpecName: "kube-api-access-sqmq5") pod "88274b5f-93e4-4810-a012-a547bd36c04d" (UID: "88274b5f-93e4-4810-a012-a547bd36c04d"). InnerVolumeSpecName "kube-api-access-sqmq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:47:20 crc kubenswrapper[4580]: I0321 05:47:20.449993 4580 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88274b5f-93e4-4810-a012-a547bd36c04d-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:47:20 crc kubenswrapper[4580]: I0321 05:47:20.450017 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqmq5\" (UniqueName: \"kubernetes.io/projected/88274b5f-93e4-4810-a012-a547bd36c04d-kube-api-access-sqmq5\") on node \"crc\" DevicePath \"\"" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.081927 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18c02aa95619fec08f50fa098be95b7572a37913199d52358a5bb71ff00d9056" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.082011 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bpk7s/crc-debug-z5kdb" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.377646 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bpk7s/crc-debug-wl76t"] Mar 21 05:47:21 crc kubenswrapper[4580]: E0321 05:47:21.378495 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88274b5f-93e4-4810-a012-a547bd36c04d" containerName="container-00" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.378517 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="88274b5f-93e4-4810-a012-a547bd36c04d" containerName="container-00" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.378889 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="88274b5f-93e4-4810-a012-a547bd36c04d" containerName="container-00" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.379849 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bpk7s/crc-debug-wl76t" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.472423 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4rzl\" (UniqueName: \"kubernetes.io/projected/cf35e99d-286f-4451-a343-77f364433024-kube-api-access-v4rzl\") pod \"crc-debug-wl76t\" (UID: \"cf35e99d-286f-4451-a343-77f364433024\") " pod="openshift-must-gather-bpk7s/crc-debug-wl76t" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.472677 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf35e99d-286f-4451-a343-77f364433024-host\") pod \"crc-debug-wl76t\" (UID: \"cf35e99d-286f-4451-a343-77f364433024\") " pod="openshift-must-gather-bpk7s/crc-debug-wl76t" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.575148 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4rzl\" (UniqueName: 
\"kubernetes.io/projected/cf35e99d-286f-4451-a343-77f364433024-kube-api-access-v4rzl\") pod \"crc-debug-wl76t\" (UID: \"cf35e99d-286f-4451-a343-77f364433024\") " pod="openshift-must-gather-bpk7s/crc-debug-wl76t" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.575472 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf35e99d-286f-4451-a343-77f364433024-host\") pod \"crc-debug-wl76t\" (UID: \"cf35e99d-286f-4451-a343-77f364433024\") " pod="openshift-must-gather-bpk7s/crc-debug-wl76t" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.575516 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf35e99d-286f-4451-a343-77f364433024-host\") pod \"crc-debug-wl76t\" (UID: \"cf35e99d-286f-4451-a343-77f364433024\") " pod="openshift-must-gather-bpk7s/crc-debug-wl76t" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.608495 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4rzl\" (UniqueName: \"kubernetes.io/projected/cf35e99d-286f-4451-a343-77f364433024-kube-api-access-v4rzl\") pod \"crc-debug-wl76t\" (UID: \"cf35e99d-286f-4451-a343-77f364433024\") " pod="openshift-must-gather-bpk7s/crc-debug-wl76t" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.648474 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88274b5f-93e4-4810-a012-a547bd36c04d" path="/var/lib/kubelet/pods/88274b5f-93e4-4810-a012-a547bd36c04d/volumes" Mar 21 05:47:21 crc kubenswrapper[4580]: I0321 05:47:21.696710 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bpk7s/crc-debug-wl76t" Mar 21 05:47:22 crc kubenswrapper[4580]: I0321 05:47:22.093945 4580 generic.go:334] "Generic (PLEG): container finished" podID="cf35e99d-286f-4451-a343-77f364433024" containerID="7bb778a6b493265b9180beb82755dad1456dec50ad6c14694f0bfc7e0425d3f8" exitCode=0 Mar 21 05:47:22 crc kubenswrapper[4580]: I0321 05:47:22.095503 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bpk7s/crc-debug-wl76t" event={"ID":"cf35e99d-286f-4451-a343-77f364433024","Type":"ContainerDied","Data":"7bb778a6b493265b9180beb82755dad1456dec50ad6c14694f0bfc7e0425d3f8"} Mar 21 05:47:22 crc kubenswrapper[4580]: I0321 05:47:22.095611 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bpk7s/crc-debug-wl76t" event={"ID":"cf35e99d-286f-4451-a343-77f364433024","Type":"ContainerStarted","Data":"10a0cae7ddd97e0a1252b837042157cfb4339d2983b0203ea6d67f6c7fe37ff3"} Mar 21 05:47:22 crc kubenswrapper[4580]: I0321 05:47:22.605583 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bpk7s/crc-debug-wl76t"] Mar 21 05:47:22 crc kubenswrapper[4580]: I0321 05:47:22.614839 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bpk7s/crc-debug-wl76t"] Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.243244 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bpk7s/crc-debug-wl76t" Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.310064 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf35e99d-286f-4451-a343-77f364433024-host\") pod \"cf35e99d-286f-4451-a343-77f364433024\" (UID: \"cf35e99d-286f-4451-a343-77f364433024\") " Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.310873 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4rzl\" (UniqueName: \"kubernetes.io/projected/cf35e99d-286f-4451-a343-77f364433024-kube-api-access-v4rzl\") pod \"cf35e99d-286f-4451-a343-77f364433024\" (UID: \"cf35e99d-286f-4451-a343-77f364433024\") " Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.313006 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf35e99d-286f-4451-a343-77f364433024-host" (OuterVolumeSpecName: "host") pod "cf35e99d-286f-4451-a343-77f364433024" (UID: "cf35e99d-286f-4451-a343-77f364433024"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.318721 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf35e99d-286f-4451-a343-77f364433024-kube-api-access-v4rzl" (OuterVolumeSpecName: "kube-api-access-v4rzl") pod "cf35e99d-286f-4451-a343-77f364433024" (UID: "cf35e99d-286f-4451-a343-77f364433024"). InnerVolumeSpecName "kube-api-access-v4rzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.413630 4580 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf35e99d-286f-4451-a343-77f364433024-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.413669 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4rzl\" (UniqueName: \"kubernetes.io/projected/cf35e99d-286f-4451-a343-77f364433024-kube-api-access-v4rzl\") on node \"crc\" DevicePath \"\"" Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.630398 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf35e99d-286f-4451-a343-77f364433024" path="/var/lib/kubelet/pods/cf35e99d-286f-4451-a343-77f364433024/volumes" Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.809872 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bpk7s/crc-debug-lmwq6"] Mar 21 05:47:23 crc kubenswrapper[4580]: E0321 05:47:23.810505 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf35e99d-286f-4451-a343-77f364433024" containerName="container-00" Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.810577 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf35e99d-286f-4451-a343-77f364433024" containerName="container-00" Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.810825 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf35e99d-286f-4451-a343-77f364433024" containerName="container-00" Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.811529 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bpk7s/crc-debug-lmwq6" Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.924918 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7fzc\" (UniqueName: \"kubernetes.io/projected/a7474afd-5834-452e-822b-4062990ee77e-kube-api-access-h7fzc\") pod \"crc-debug-lmwq6\" (UID: \"a7474afd-5834-452e-822b-4062990ee77e\") " pod="openshift-must-gather-bpk7s/crc-debug-lmwq6" Mar 21 05:47:23 crc kubenswrapper[4580]: I0321 05:47:23.925056 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7474afd-5834-452e-822b-4062990ee77e-host\") pod \"crc-debug-lmwq6\" (UID: \"a7474afd-5834-452e-822b-4062990ee77e\") " pod="openshift-must-gather-bpk7s/crc-debug-lmwq6" Mar 21 05:47:24 crc kubenswrapper[4580]: I0321 05:47:24.026419 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7474afd-5834-452e-822b-4062990ee77e-host\") pod \"crc-debug-lmwq6\" (UID: \"a7474afd-5834-452e-822b-4062990ee77e\") " pod="openshift-must-gather-bpk7s/crc-debug-lmwq6" Mar 21 05:47:24 crc kubenswrapper[4580]: I0321 05:47:24.026549 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7474afd-5834-452e-822b-4062990ee77e-host\") pod \"crc-debug-lmwq6\" (UID: \"a7474afd-5834-452e-822b-4062990ee77e\") " pod="openshift-must-gather-bpk7s/crc-debug-lmwq6" Mar 21 05:47:24 crc kubenswrapper[4580]: I0321 05:47:24.026638 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7fzc\" (UniqueName: \"kubernetes.io/projected/a7474afd-5834-452e-822b-4062990ee77e-kube-api-access-h7fzc\") pod \"crc-debug-lmwq6\" (UID: \"a7474afd-5834-452e-822b-4062990ee77e\") " pod="openshift-must-gather-bpk7s/crc-debug-lmwq6" Mar 21 05:47:24 crc 
kubenswrapper[4580]: I0321 05:47:24.043240 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7fzc\" (UniqueName: \"kubernetes.io/projected/a7474afd-5834-452e-822b-4062990ee77e-kube-api-access-h7fzc\") pod \"crc-debug-lmwq6\" (UID: \"a7474afd-5834-452e-822b-4062990ee77e\") " pod="openshift-must-gather-bpk7s/crc-debug-lmwq6" Mar 21 05:47:24 crc kubenswrapper[4580]: I0321 05:47:24.113313 4580 scope.go:117] "RemoveContainer" containerID="7bb778a6b493265b9180beb82755dad1456dec50ad6c14694f0bfc7e0425d3f8" Mar 21 05:47:24 crc kubenswrapper[4580]: I0321 05:47:24.113548 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bpk7s/crc-debug-wl76t" Mar 21 05:47:24 crc kubenswrapper[4580]: I0321 05:47:24.134280 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bpk7s/crc-debug-lmwq6" Mar 21 05:47:24 crc kubenswrapper[4580]: W0321 05:47:24.192566 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7474afd_5834_452e_822b_4062990ee77e.slice/crio-6047510200c1e4706d5abdb1928acefa0de7f70e5455472893fec0d6c8d36f36 WatchSource:0}: Error finding container 6047510200c1e4706d5abdb1928acefa0de7f70e5455472893fec0d6c8d36f36: Status 404 returned error can't find the container with id 6047510200c1e4706d5abdb1928acefa0de7f70e5455472893fec0d6c8d36f36 Mar 21 05:47:24 crc kubenswrapper[4580]: I0321 05:47:24.617772 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:47:24 crc kubenswrapper[4580]: E0321 05:47:24.618266 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:47:25 crc kubenswrapper[4580]: I0321 05:47:25.127289 4580 generic.go:334] "Generic (PLEG): container finished" podID="a7474afd-5834-452e-822b-4062990ee77e" containerID="343c965422e840c513c4c4caf7509a075d40c1028e9334b3516b61fef53e8873" exitCode=0 Mar 21 05:47:25 crc kubenswrapper[4580]: I0321 05:47:25.127333 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bpk7s/crc-debug-lmwq6" event={"ID":"a7474afd-5834-452e-822b-4062990ee77e","Type":"ContainerDied","Data":"343c965422e840c513c4c4caf7509a075d40c1028e9334b3516b61fef53e8873"} Mar 21 05:47:25 crc kubenswrapper[4580]: I0321 05:47:25.127357 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bpk7s/crc-debug-lmwq6" event={"ID":"a7474afd-5834-452e-822b-4062990ee77e","Type":"ContainerStarted","Data":"6047510200c1e4706d5abdb1928acefa0de7f70e5455472893fec0d6c8d36f36"} Mar 21 05:47:25 crc kubenswrapper[4580]: I0321 05:47:25.171273 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bpk7s/crc-debug-lmwq6"] Mar 21 05:47:25 crc kubenswrapper[4580]: I0321 05:47:25.178929 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bpk7s/crc-debug-lmwq6"] Mar 21 05:47:26 crc kubenswrapper[4580]: I0321 05:47:26.254483 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bpk7s/crc-debug-lmwq6" Mar 21 05:47:26 crc kubenswrapper[4580]: I0321 05:47:26.393061 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7fzc\" (UniqueName: \"kubernetes.io/projected/a7474afd-5834-452e-822b-4062990ee77e-kube-api-access-h7fzc\") pod \"a7474afd-5834-452e-822b-4062990ee77e\" (UID: \"a7474afd-5834-452e-822b-4062990ee77e\") " Mar 21 05:47:26 crc kubenswrapper[4580]: I0321 05:47:26.393286 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7474afd-5834-452e-822b-4062990ee77e-host\") pod \"a7474afd-5834-452e-822b-4062990ee77e\" (UID: \"a7474afd-5834-452e-822b-4062990ee77e\") " Mar 21 05:47:26 crc kubenswrapper[4580]: I0321 05:47:26.393408 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7474afd-5834-452e-822b-4062990ee77e-host" (OuterVolumeSpecName: "host") pod "a7474afd-5834-452e-822b-4062990ee77e" (UID: "a7474afd-5834-452e-822b-4062990ee77e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:47:26 crc kubenswrapper[4580]: I0321 05:47:26.393832 4580 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7474afd-5834-452e-822b-4062990ee77e-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:47:26 crc kubenswrapper[4580]: I0321 05:47:26.415923 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7474afd-5834-452e-822b-4062990ee77e-kube-api-access-h7fzc" (OuterVolumeSpecName: "kube-api-access-h7fzc") pod "a7474afd-5834-452e-822b-4062990ee77e" (UID: "a7474afd-5834-452e-822b-4062990ee77e"). InnerVolumeSpecName "kube-api-access-h7fzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:47:26 crc kubenswrapper[4580]: I0321 05:47:26.495840 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7fzc\" (UniqueName: \"kubernetes.io/projected/a7474afd-5834-452e-822b-4062990ee77e-kube-api-access-h7fzc\") on node \"crc\" DevicePath \"\"" Mar 21 05:47:27 crc kubenswrapper[4580]: I0321 05:47:27.148199 4580 scope.go:117] "RemoveContainer" containerID="343c965422e840c513c4c4caf7509a075d40c1028e9334b3516b61fef53e8873" Mar 21 05:47:27 crc kubenswrapper[4580]: I0321 05:47:27.148274 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bpk7s/crc-debug-lmwq6" Mar 21 05:47:27 crc kubenswrapper[4580]: I0321 05:47:27.640028 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7474afd-5834-452e-822b-4062990ee77e" path="/var/lib/kubelet/pods/a7474afd-5834-452e-822b-4062990ee77e/volumes" Mar 21 05:47:35 crc kubenswrapper[4580]: I0321 05:47:35.623561 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:47:35 crc kubenswrapper[4580]: E0321 05:47:35.624628 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.024549 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rc6d6"] Mar 21 05:47:45 crc kubenswrapper[4580]: E0321 05:47:45.026428 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7474afd-5834-452e-822b-4062990ee77e" 
containerName="container-00" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.026456 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7474afd-5834-452e-822b-4062990ee77e" containerName="container-00" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.026855 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7474afd-5834-452e-822b-4062990ee77e" containerName="container-00" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.029501 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.037740 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rc6d6"] Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.090233 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-catalog-content\") pod \"redhat-operators-rc6d6\" (UID: \"9b3b89c8-8fd1-4c72-b487-199450439b10\") " pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.090317 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77xlt\" (UniqueName: \"kubernetes.io/projected/9b3b89c8-8fd1-4c72-b487-199450439b10-kube-api-access-77xlt\") pod \"redhat-operators-rc6d6\" (UID: \"9b3b89c8-8fd1-4c72-b487-199450439b10\") " pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.090396 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-utilities\") pod \"redhat-operators-rc6d6\" (UID: \"9b3b89c8-8fd1-4c72-b487-199450439b10\") " 
pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.192891 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77xlt\" (UniqueName: \"kubernetes.io/projected/9b3b89c8-8fd1-4c72-b487-199450439b10-kube-api-access-77xlt\") pod \"redhat-operators-rc6d6\" (UID: \"9b3b89c8-8fd1-4c72-b487-199450439b10\") " pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.193083 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-utilities\") pod \"redhat-operators-rc6d6\" (UID: \"9b3b89c8-8fd1-4c72-b487-199450439b10\") " pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.193251 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-catalog-content\") pod \"redhat-operators-rc6d6\" (UID: \"9b3b89c8-8fd1-4c72-b487-199450439b10\") " pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.193775 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-utilities\") pod \"redhat-operators-rc6d6\" (UID: \"9b3b89c8-8fd1-4c72-b487-199450439b10\") " pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.194042 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-catalog-content\") pod \"redhat-operators-rc6d6\" (UID: \"9b3b89c8-8fd1-4c72-b487-199450439b10\") " pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:47:45 crc 
kubenswrapper[4580]: I0321 05:47:45.220312 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77xlt\" (UniqueName: \"kubernetes.io/projected/9b3b89c8-8fd1-4c72-b487-199450439b10-kube-api-access-77xlt\") pod \"redhat-operators-rc6d6\" (UID: \"9b3b89c8-8fd1-4c72-b487-199450439b10\") " pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.419847 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:47:45 crc kubenswrapper[4580]: I0321 05:47:45.958135 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rc6d6"] Mar 21 05:47:46 crc kubenswrapper[4580]: I0321 05:47:46.342294 4580 generic.go:334] "Generic (PLEG): container finished" podID="9b3b89c8-8fd1-4c72-b487-199450439b10" containerID="6648488027b56aa5842a89bb093e874a3f7fdc87c2975e4b9593c118d7987fec" exitCode=0 Mar 21 05:47:46 crc kubenswrapper[4580]: I0321 05:47:46.342588 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc6d6" event={"ID":"9b3b89c8-8fd1-4c72-b487-199450439b10","Type":"ContainerDied","Data":"6648488027b56aa5842a89bb093e874a3f7fdc87c2975e4b9593c118d7987fec"} Mar 21 05:47:46 crc kubenswrapper[4580]: I0321 05:47:46.342621 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc6d6" event={"ID":"9b3b89c8-8fd1-4c72-b487-199450439b10","Type":"ContainerStarted","Data":"f6aeb32e0a608969979d4e5b29d95eb9e17078dee24dc77e32d8b4b67e40ca5b"} Mar 21 05:47:47 crc kubenswrapper[4580]: I0321 05:47:47.353342 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc6d6" event={"ID":"9b3b89c8-8fd1-4c72-b487-199450439b10","Type":"ContainerStarted","Data":"84099185e9858dc85e9a1e460d0f25784a358d4ce238403b63d13033c3033c09"} Mar 21 05:47:49 crc kubenswrapper[4580]: I0321 
05:47:49.290143 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d486fc764-m7r7b_c225fecd-c259-40cb-898c-78dc724d1db8/barbican-api/0.log" Mar 21 05:47:49 crc kubenswrapper[4580]: I0321 05:47:49.454948 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d486fc764-m7r7b_c225fecd-c259-40cb-898c-78dc724d1db8/barbican-api-log/0.log" Mar 21 05:47:49 crc kubenswrapper[4580]: I0321 05:47:49.564115 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cfb6cbb9d-ln66c_0f4b4bf3-0508-4021-916b-97694fe670ff/barbican-keystone-listener/0.log" Mar 21 05:47:49 crc kubenswrapper[4580]: I0321 05:47:49.618746 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:47:49 crc kubenswrapper[4580]: E0321 05:47:49.619243 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:47:49 crc kubenswrapper[4580]: I0321 05:47:49.844217 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-787f545779-9db4b_d79bd04a-35d0-48ab-883f-982e3129d435/barbican-worker/0.log" Mar 21 05:47:49 crc kubenswrapper[4580]: I0321 05:47:49.863369 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cfb6cbb9d-ln66c_0f4b4bf3-0508-4021-916b-97694fe670ff/barbican-keystone-listener-log/0.log" Mar 21 05:47:50 crc kubenswrapper[4580]: I0321 05:47:50.007141 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-787f545779-9db4b_d79bd04a-35d0-48ab-883f-982e3129d435/barbican-worker-log/0.log" Mar 21 05:47:50 crc kubenswrapper[4580]: I0321 05:47:50.235108 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x_ddfb2a5d-1386-4dac-aee6-316bce48c76b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:47:50 crc kubenswrapper[4580]: I0321 05:47:50.420610 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c77c9b9f-3e73-4cef-9e10-39bfef8357b5/ceilometer-notification-agent/0.log" Mar 21 05:47:50 crc kubenswrapper[4580]: I0321 05:47:50.423738 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c77c9b9f-3e73-4cef-9e10-39bfef8357b5/ceilometer-central-agent/0.log" Mar 21 05:47:50 crc kubenswrapper[4580]: I0321 05:47:50.498195 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c77c9b9f-3e73-4cef-9e10-39bfef8357b5/sg-core/0.log" Mar 21 05:47:50 crc kubenswrapper[4580]: I0321 05:47:50.507087 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c77c9b9f-3e73-4cef-9e10-39bfef8357b5/proxy-httpd/0.log" Mar 21 05:47:50 crc kubenswrapper[4580]: I0321 05:47:50.797341 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_18848c19-7735-494d-babb-32e04c8ef382/cinder-api-log/0.log" Mar 21 05:47:50 crc kubenswrapper[4580]: I0321 05:47:50.853162 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_18848c19-7735-494d-babb-32e04c8ef382/cinder-api/0.log" Mar 21 05:47:51 crc kubenswrapper[4580]: I0321 05:47:51.009858 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a6a3d4de-9969-48f3-9f1a-9f273f81050a/probe/0.log" Mar 21 05:47:51 crc kubenswrapper[4580]: I0321 05:47:51.117652 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_a6a3d4de-9969-48f3-9f1a-9f273f81050a/cinder-scheduler/0.log" Mar 21 05:47:51 crc kubenswrapper[4580]: I0321 05:47:51.269543 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-4chpn_5014a479-6112-4f5c-9824-db4736d248f4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:47:51 crc kubenswrapper[4580]: I0321 05:47:51.572354 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cd9bffc9-mrwrr_0a878571-91e7-486e-8258-fc3298a5e03f/init/0.log" Mar 21 05:47:51 crc kubenswrapper[4580]: I0321 05:47:51.665326 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-gck9n_136b85ae-b1b7-46cf-a8fa-059f29999f31/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:47:51 crc kubenswrapper[4580]: I0321 05:47:51.840452 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cd9bffc9-mrwrr_0a878571-91e7-486e-8258-fc3298a5e03f/init/0.log" Mar 21 05:47:52 crc kubenswrapper[4580]: I0321 05:47:52.020368 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cd9bffc9-mrwrr_0a878571-91e7-486e-8258-fc3298a5e03f/dnsmasq-dns/0.log" Mar 21 05:47:52 crc kubenswrapper[4580]: I0321 05:47:52.123514 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-fl44d_73c35bcd-08ba-44f4-96c4-4d29bcf84b5f/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:47:52 crc kubenswrapper[4580]: I0321 05:47:52.199105 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_551ce5a9-fc21-4f0c-9c38-d53b829c5979/glance-httpd/0.log" Mar 21 05:47:52 crc kubenswrapper[4580]: I0321 05:47:52.309000 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_551ce5a9-fc21-4f0c-9c38-d53b829c5979/glance-log/0.log" Mar 21 05:47:52 crc kubenswrapper[4580]: I0321 05:47:52.412764 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f151509-94d2-4991-89f7-c7757d14b867/glance-log/0.log" Mar 21 05:47:52 crc kubenswrapper[4580]: I0321 05:47:52.438968 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f151509-94d2-4991-89f7-c7757d14b867/glance-httpd/0.log" Mar 21 05:47:52 crc kubenswrapper[4580]: I0321 05:47:52.642936 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-67655f8b6-mbx6n_a03ce0fa-f7e8-4b48-bbea-95807f14dd26/horizon/4.log" Mar 21 05:47:52 crc kubenswrapper[4580]: I0321 05:47:52.759930 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-67655f8b6-mbx6n_a03ce0fa-f7e8-4b48-bbea-95807f14dd26/horizon/3.log" Mar 21 05:47:53 crc kubenswrapper[4580]: I0321 05:47:53.044557 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-7skn5_72b35ded-99db-471e-b265-5e1e0467af49/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:47:53 crc kubenswrapper[4580]: I0321 05:47:53.065238 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-67655f8b6-mbx6n_a03ce0fa-f7e8-4b48-bbea-95807f14dd26/horizon-log/0.log" Mar 21 05:47:53 crc kubenswrapper[4580]: I0321 05:47:53.714879 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ca4bc346-fdf8-4e43-8bbb-ea6c80333c43/kube-state-metrics/0.log" Mar 21 05:47:53 crc kubenswrapper[4580]: I0321 05:47:53.793554 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5b8sc_42a67be6-2662-40e1-a94d-0b7fa55c1bc0/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 
05:47:53 crc kubenswrapper[4580]: I0321 05:47:53.857799 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6d45658b5d-dfjj4_19491f31-c899-4d84-a81b-262d0660b2c1/keystone-api/0.log" Mar 21 05:47:54 crc kubenswrapper[4580]: I0321 05:47:54.410623 4580 generic.go:334] "Generic (PLEG): container finished" podID="9b3b89c8-8fd1-4c72-b487-199450439b10" containerID="84099185e9858dc85e9a1e460d0f25784a358d4ce238403b63d13033c3033c09" exitCode=0 Mar 21 05:47:54 crc kubenswrapper[4580]: I0321 05:47:54.411004 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc6d6" event={"ID":"9b3b89c8-8fd1-4c72-b487-199450439b10","Type":"ContainerDied","Data":"84099185e9858dc85e9a1e460d0f25784a358d4ce238403b63d13033c3033c09"} Mar 21 05:47:54 crc kubenswrapper[4580]: I0321 05:47:54.482224 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6cd755485-pmnqc_0804de84-fb1f-40cf-af99-b67d2eb64fc4/neutron-httpd/0.log" Mar 21 05:47:54 crc kubenswrapper[4580]: I0321 05:47:54.622997 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6cd755485-pmnqc_0804de84-fb1f-40cf-af99-b67d2eb64fc4/neutron-api/0.log" Mar 21 05:47:55 crc kubenswrapper[4580]: I0321 05:47:55.150883 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj_24785e2f-2d74-4dd1-97dd-10e58843652e/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:47:55 crc kubenswrapper[4580]: I0321 05:47:55.429133 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc6d6" event={"ID":"9b3b89c8-8fd1-4c72-b487-199450439b10","Type":"ContainerStarted","Data":"55dd97cb36d6f5e5a8781c8645c49f9610b0066e8526ba1e74d03e8165b8e919"} Mar 21 05:47:55 crc kubenswrapper[4580]: I0321 05:47:55.466515 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-rc6d6" podStartSLOduration=3.01674405 podStartE2EDuration="11.466491759s" podCreationTimestamp="2026-03-21 05:47:44 +0000 UTC" firstStartedPulling="2026-03-21 05:47:46.348022023 +0000 UTC m=+3371.430605651" lastFinishedPulling="2026-03-21 05:47:54.797769732 +0000 UTC m=+3379.880353360" observedRunningTime="2026-03-21 05:47:55.44831406 +0000 UTC m=+3380.530897708" watchObservedRunningTime="2026-03-21 05:47:55.466491759 +0000 UTC m=+3380.549075407" Mar 21 05:47:55 crc kubenswrapper[4580]: I0321 05:47:55.483092 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ae93ee16-d710-434d-b070-65215d559dfb/nova-api-log/0.log" Mar 21 05:47:55 crc kubenswrapper[4580]: I0321 05:47:55.652201 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv_e355e210-9abe-4bdf-bcbf-70e95e437482/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:47:55 crc kubenswrapper[4580]: I0321 05:47:55.674946 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ae93ee16-d710-434d-b070-65215d559dfb/nova-api-api/0.log" Mar 21 05:47:55 crc kubenswrapper[4580]: I0321 05:47:55.790500 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_dc118c0b-9b79-4e70-a775-a437c1b83b2c/nova-cell0-conductor-conductor/0.log" Mar 21 05:47:56 crc kubenswrapper[4580]: I0321 05:47:56.114816 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_fcc0c177-5dea-46ab-9eb9-aa66a23d909f/nova-cell1-conductor-conductor/0.log" Mar 21 05:47:56 crc kubenswrapper[4580]: I0321 05:47:56.188229 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b7bd64a0-ec65-4f8c-841c-ca1950434439/nova-cell1-novncproxy-novncproxy/0.log" Mar 21 05:47:56 crc kubenswrapper[4580]: I0321 05:47:56.635742 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_984d329d-aa14-46fb-9c9f-c5f9eb415f73/nova-metadata-log/0.log" Mar 21 05:47:57 crc kubenswrapper[4580]: I0321 05:47:57.037473 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_984d329d-aa14-46fb-9c9f-c5f9eb415f73/nova-metadata-metadata/0.log" Mar 21 05:47:57 crc kubenswrapper[4580]: I0321 05:47:57.062370 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6b628f21-06f6-4838-805f-b0d25851ac35/nova-scheduler-scheduler/0.log" Mar 21 05:47:57 crc kubenswrapper[4580]: I0321 05:47:57.279248 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2da281c0-51d3-4264-8924-83dbc85ecbf0/mysql-bootstrap/0.log" Mar 21 05:47:57 crc kubenswrapper[4580]: I0321 05:47:57.537334 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2da281c0-51d3-4264-8924-83dbc85ecbf0/mysql-bootstrap/0.log" Mar 21 05:47:57 crc kubenswrapper[4580]: I0321 05:47:57.558261 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-gxx9h_bf805790-d6ce-495d-8d85-dd7cf68b4bf3/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:47:57 crc kubenswrapper[4580]: I0321 05:47:57.688502 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2da281c0-51d3-4264-8924-83dbc85ecbf0/galera/0.log" Mar 21 05:47:57 crc kubenswrapper[4580]: I0321 05:47:57.953374 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b4f4841a-f9ee-4d9d-b756-77cabd20363a/mysql-bootstrap/0.log" Mar 21 05:47:58 crc kubenswrapper[4580]: I0321 05:47:58.207960 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b4f4841a-f9ee-4d9d-b756-77cabd20363a/galera/0.log" Mar 21 05:47:58 crc kubenswrapper[4580]: I0321 05:47:58.216913 4580 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_openstack-galera-0_b4f4841a-f9ee-4d9d-b756-77cabd20363a/mysql-bootstrap/0.log" Mar 21 05:47:58 crc kubenswrapper[4580]: I0321 05:47:58.256727 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_286ff68a-a9d7-4592-9146-f9537c8cf329/openstackclient/0.log" Mar 21 05:47:58 crc kubenswrapper[4580]: I0321 05:47:58.971631 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-vpk6m_cc8eda41-b1d2-4f48-ac6e-59b7856a0917/openstack-network-exporter/0.log" Mar 21 05:47:59 crc kubenswrapper[4580]: I0321 05:47:59.009129 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-jjv5q_15016044-062f-44bc-8278-97a43b709083/ovn-controller/0.log" Mar 21 05:47:59 crc kubenswrapper[4580]: I0321 05:47:59.275004 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tqfdg_893ab010-283a-4331-834a-05586719a352/ovsdb-server-init/0.log" Mar 21 05:47:59 crc kubenswrapper[4580]: I0321 05:47:59.648710 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tqfdg_893ab010-283a-4331-834a-05586719a352/ovs-vswitchd/0.log" Mar 21 05:47:59 crc kubenswrapper[4580]: I0321 05:47:59.660314 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tqfdg_893ab010-283a-4331-834a-05586719a352/ovsdb-server/0.log" Mar 21 05:47:59 crc kubenswrapper[4580]: I0321 05:47:59.707672 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tqfdg_893ab010-283a-4331-834a-05586719a352/ovsdb-server-init/0.log" Mar 21 05:47:59 crc kubenswrapper[4580]: I0321 05:47:59.993292 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_69db8b67-aa51-41d9-8088-dba10b9bdd0d/openstack-network-exporter/0.log" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.080747 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_69db8b67-aa51-41d9-8088-dba10b9bdd0d/ovn-northd/0.log" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.145467 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-j2klj_fe57de6b-1ee3-4bdb-91b8-d81369a7fc72/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.244158 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567868-vmcb5"] Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.245996 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567868-vmcb5" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.250540 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.250676 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.251250 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.256488 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567868-vmcb5"] Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.316968 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zqw4\" (UniqueName: \"kubernetes.io/projected/9730819c-907e-4985-858b-dbed32715065-kube-api-access-9zqw4\") pod \"auto-csr-approver-29567868-vmcb5\" (UID: \"9730819c-907e-4985-858b-dbed32715065\") " pod="openshift-infra/auto-csr-approver-29567868-vmcb5" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.372177 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_358c9476-8608-43e6-9912-6be4fb3f2ba8/openstack-network-exporter/0.log" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.418430 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zqw4\" (UniqueName: \"kubernetes.io/projected/9730819c-907e-4985-858b-dbed32715065-kube-api-access-9zqw4\") pod \"auto-csr-approver-29567868-vmcb5\" (UID: \"9730819c-907e-4985-858b-dbed32715065\") " pod="openshift-infra/auto-csr-approver-29567868-vmcb5" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.423163 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_358c9476-8608-43e6-9912-6be4fb3f2ba8/ovsdbserver-nb/0.log" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.472439 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zqw4\" (UniqueName: \"kubernetes.io/projected/9730819c-907e-4985-858b-dbed32715065-kube-api-access-9zqw4\") pod \"auto-csr-approver-29567868-vmcb5\" (UID: \"9730819c-907e-4985-858b-dbed32715065\") " pod="openshift-infra/auto-csr-approver-29567868-vmcb5" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.585218 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567868-vmcb5" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.707447 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_514b5967-88ad-43e2-aa38-88551fba381d/openstack-network-exporter/0.log" Mar 21 05:48:00 crc kubenswrapper[4580]: I0321 05:48:00.897234 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_514b5967-88ad-43e2-aa38-88551fba381d/ovsdbserver-sb/0.log" Mar 21 05:48:01 crc kubenswrapper[4580]: I0321 05:48:01.051064 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67f74b898d-dtzvd_3cbb1901-c5ee-4f46-aa6d-ac31372a9b83/placement-api/0.log" Mar 21 05:48:01 crc kubenswrapper[4580]: I0321 05:48:01.076587 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67f74b898d-dtzvd_3cbb1901-c5ee-4f46-aa6d-ac31372a9b83/placement-log/0.log" Mar 21 05:48:01 crc kubenswrapper[4580]: I0321 05:48:01.143872 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7619a3e5-e696-412d-8550-c8c30660eacd/setup-container/0.log" Mar 21 05:48:01 crc kubenswrapper[4580]: I0321 05:48:01.207025 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567868-vmcb5"] Mar 21 05:48:01 crc kubenswrapper[4580]: W0321 05:48:01.231359 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9730819c_907e_4985_858b_dbed32715065.slice/crio-6c6f51b3086ec0dbd79bd836097bb87fc89573ed9173f1ac335df06e73b430e8 WatchSource:0}: Error finding container 6c6f51b3086ec0dbd79bd836097bb87fc89573ed9173f1ac335df06e73b430e8: Status 404 returned error can't find the container with id 6c6f51b3086ec0dbd79bd836097bb87fc89573ed9173f1ac335df06e73b430e8 Mar 21 05:48:01 crc kubenswrapper[4580]: I0321 05:48:01.524434 4580 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567868-vmcb5" event={"ID":"9730819c-907e-4985-858b-dbed32715065","Type":"ContainerStarted","Data":"6c6f51b3086ec0dbd79bd836097bb87fc89573ed9173f1ac335df06e73b430e8"} Mar 21 05:48:01 crc kubenswrapper[4580]: I0321 05:48:01.687539 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7619a3e5-e696-412d-8550-c8c30660eacd/rabbitmq/0.log" Mar 21 05:48:01 crc kubenswrapper[4580]: I0321 05:48:01.719629 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7619a3e5-e696-412d-8550-c8c30660eacd/setup-container/0.log" Mar 21 05:48:01 crc kubenswrapper[4580]: I0321 05:48:01.791022 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_364da597-ba18-4d63-b1be-1d925e603515/setup-container/0.log" Mar 21 05:48:02 crc kubenswrapper[4580]: I0321 05:48:02.151992 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_364da597-ba18-4d63-b1be-1d925e603515/rabbitmq/0.log" Mar 21 05:48:02 crc kubenswrapper[4580]: I0321 05:48:02.192188 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn_bb5e8570-68a2-47c9-bd31-4be0389bd713/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:02 crc kubenswrapper[4580]: I0321 05:48:02.242636 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_364da597-ba18-4d63-b1be-1d925e603515/setup-container/0.log" Mar 21 05:48:02 crc kubenswrapper[4580]: I0321 05:48:02.533109 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-dp8qx_251c60b9-f972-4aec-85af-f00d48e21662/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:02 crc kubenswrapper[4580]: I0321 05:48:02.534653 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567868-vmcb5" event={"ID":"9730819c-907e-4985-858b-dbed32715065","Type":"ContainerStarted","Data":"c630e88ceb7733f4e1e827d58edcbf064f7c68dd8a45360ceaca3f231c525bdf"} Mar 21 05:48:02 crc kubenswrapper[4580]: I0321 05:48:02.553524 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567868-vmcb5" podStartSLOduration=1.675385952 podStartE2EDuration="2.553506935s" podCreationTimestamp="2026-03-21 05:48:00 +0000 UTC" firstStartedPulling="2026-03-21 05:48:01.25287886 +0000 UTC m=+3386.335462478" lastFinishedPulling="2026-03-21 05:48:02.130999833 +0000 UTC m=+3387.213583461" observedRunningTime="2026-03-21 05:48:02.547495263 +0000 UTC m=+3387.630078901" watchObservedRunningTime="2026-03-21 05:48:02.553506935 +0000 UTC m=+3387.636090563" Mar 21 05:48:02 crc kubenswrapper[4580]: I0321 05:48:02.560122 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h_668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:02 crc kubenswrapper[4580]: I0321 05:48:02.949937 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-jcqfr_30665d01-e41e-4e5e-ad25-f4430eb5866a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:03 crc kubenswrapper[4580]: I0321 05:48:03.017927 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cxplm_b448b2a2-1171-4d9a-b28f-c0d8805134df/ssh-known-hosts-edpm-deployment/0.log" Mar 21 05:48:03 crc kubenswrapper[4580]: I0321 05:48:03.283609 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6dbb667f95-c5g4x_21065819-f94d-4cc9-925f-c4be4eeee0d7/proxy-httpd/0.log" Mar 21 05:48:03 crc kubenswrapper[4580]: I0321 05:48:03.308000 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-6dbb667f95-c5g4x_21065819-f94d-4cc9-925f-c4be4eeee0d7/proxy-server/0.log" Mar 21 05:48:03 crc kubenswrapper[4580]: I0321 05:48:03.549396 4580 generic.go:334] "Generic (PLEG): container finished" podID="9730819c-907e-4985-858b-dbed32715065" containerID="c630e88ceb7733f4e1e827d58edcbf064f7c68dd8a45360ceaca3f231c525bdf" exitCode=0 Mar 21 05:48:03 crc kubenswrapper[4580]: I0321 05:48:03.549433 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567868-vmcb5" event={"ID":"9730819c-907e-4985-858b-dbed32715065","Type":"ContainerDied","Data":"c630e88ceb7733f4e1e827d58edcbf064f7c68dd8a45360ceaca3f231c525bdf"} Mar 21 05:48:03 crc kubenswrapper[4580]: I0321 05:48:03.552703 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-kkh4r_3d23c194-d398-4264-8726-c75316c85eff/swift-ring-rebalance/0.log" Mar 21 05:48:03 crc kubenswrapper[4580]: I0321 05:48:03.619567 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:48:03 crc kubenswrapper[4580]: E0321 05:48:03.619829 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:48:03 crc kubenswrapper[4580]: I0321 05:48:03.728389 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/account-auditor/0.log" Mar 21 05:48:03 crc kubenswrapper[4580]: I0321 05:48:03.781859 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/account-reaper/0.log" Mar 21 05:48:03 crc kubenswrapper[4580]: I0321 05:48:03.830852 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/account-replicator/0.log" Mar 21 05:48:03 crc kubenswrapper[4580]: I0321 05:48:03.965949 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/account-server/0.log" Mar 21 05:48:04 crc kubenswrapper[4580]: I0321 05:48:04.046145 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/container-auditor/0.log" Mar 21 05:48:04 crc kubenswrapper[4580]: I0321 05:48:04.088315 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/container-replicator/0.log" Mar 21 05:48:04 crc kubenswrapper[4580]: I0321 05:48:04.194441 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/container-server/0.log" Mar 21 05:48:04 crc kubenswrapper[4580]: I0321 05:48:04.306249 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/object-auditor/0.log" Mar 21 05:48:04 crc kubenswrapper[4580]: I0321 05:48:04.330208 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/container-updater/0.log" Mar 21 05:48:04 crc kubenswrapper[4580]: I0321 05:48:04.418293 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/object-expirer/0.log" Mar 21 05:48:04 crc kubenswrapper[4580]: I0321 05:48:04.432599 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/object-replicator/0.log" Mar 21 05:48:04 crc kubenswrapper[4580]: I0321 05:48:04.611632 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/object-updater/0.log" Mar 21 05:48:04 crc kubenswrapper[4580]: I0321 05:48:04.635884 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/object-server/0.log" Mar 21 05:48:04 crc kubenswrapper[4580]: I0321 05:48:04.698067 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/rsync/0.log" Mar 21 05:48:04 crc kubenswrapper[4580]: I0321 05:48:04.717818 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/swift-recon-cron/0.log" Mar 21 05:48:04 crc kubenswrapper[4580]: I0321 05:48:04.981462 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567868-vmcb5" Mar 21 05:48:05 crc kubenswrapper[4580]: I0321 05:48:05.120289 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_c692d589-bfb1-449b-91ff-8517954bc204/tempest-tests-tempest-tests-runner/0.log" Mar 21 05:48:05 crc kubenswrapper[4580]: I0321 05:48:05.138444 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zqw4\" (UniqueName: \"kubernetes.io/projected/9730819c-907e-4985-858b-dbed32715065-kube-api-access-9zqw4\") pod \"9730819c-907e-4985-858b-dbed32715065\" (UID: \"9730819c-907e-4985-858b-dbed32715065\") " Mar 21 05:48:05 crc kubenswrapper[4580]: I0321 05:48:05.156998 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9730819c-907e-4985-858b-dbed32715065-kube-api-access-9zqw4" (OuterVolumeSpecName: "kube-api-access-9zqw4") pod "9730819c-907e-4985-858b-dbed32715065" (UID: "9730819c-907e-4985-858b-dbed32715065"). InnerVolumeSpecName "kube-api-access-9zqw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:48:05 crc kubenswrapper[4580]: I0321 05:48:05.244461 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zqw4\" (UniqueName: \"kubernetes.io/projected/9730819c-907e-4985-858b-dbed32715065-kube-api-access-9zqw4\") on node \"crc\" DevicePath \"\"" Mar 21 05:48:05 crc kubenswrapper[4580]: I0321 05:48:05.324289 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_25fd8fca-3d1d-4c2e-af01-c5ca004814fd/test-operator-logs-container/0.log" Mar 21 05:48:05 crc kubenswrapper[4580]: I0321 05:48:05.420055 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:48:05 crc kubenswrapper[4580]: I0321 05:48:05.420098 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:48:05 crc kubenswrapper[4580]: I0321 05:48:05.569502 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567868-vmcb5" event={"ID":"9730819c-907e-4985-858b-dbed32715065","Type":"ContainerDied","Data":"6c6f51b3086ec0dbd79bd836097bb87fc89573ed9173f1ac335df06e73b430e8"} Mar 21 05:48:05 crc kubenswrapper[4580]: I0321 05:48:05.569541 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c6f51b3086ec0dbd79bd836097bb87fc89573ed9173f1ac335df06e73b430e8" Mar 21 05:48:05 crc kubenswrapper[4580]: I0321 05:48:05.569630 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567868-vmcb5" Mar 21 05:48:05 crc kubenswrapper[4580]: I0321 05:48:05.589803 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2_01313952-673b-45c9-b24b-0317ed817834/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:05 crc kubenswrapper[4580]: I0321 05:48:05.643300 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567862-rwqbg"] Mar 21 05:48:05 crc kubenswrapper[4580]: I0321 05:48:05.653414 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567862-rwqbg"] Mar 21 05:48:06 crc kubenswrapper[4580]: I0321 05:48:06.008093 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-bc666_53375dd9-0a2b-413f-8fa2-1ebd8d63df42/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:06 crc kubenswrapper[4580]: I0321 05:48:06.478438 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rc6d6" podUID="9b3b89c8-8fd1-4c72-b487-199450439b10" containerName="registry-server" probeResult="failure" output=< Mar 21 05:48:06 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:48:06 crc kubenswrapper[4580]: > Mar 21 05:48:07 crc kubenswrapper[4580]: I0321 05:48:07.653399 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e06aaf-09aa-47d0-823b-05c80face2d4" path="/var/lib/kubelet/pods/e3e06aaf-09aa-47d0-823b-05c80face2d4/volumes" Mar 21 05:48:12 crc kubenswrapper[4580]: I0321 05:48:12.244920 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_226921bf-412a-4dc6-a722-3fcf5ecc7fdc/memcached/0.log" Mar 21 05:48:16 crc kubenswrapper[4580]: I0321 05:48:16.471815 4580 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-rc6d6" podUID="9b3b89c8-8fd1-4c72-b487-199450439b10" containerName="registry-server" probeResult="failure" output=< Mar 21 05:48:16 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:48:16 crc kubenswrapper[4580]: > Mar 21 05:48:16 crc kubenswrapper[4580]: I0321 05:48:16.618897 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:48:16 crc kubenswrapper[4580]: E0321 05:48:16.619142 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:48:26 crc kubenswrapper[4580]: I0321 05:48:26.475689 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rc6d6" podUID="9b3b89c8-8fd1-4c72-b487-199450439b10" containerName="registry-server" probeResult="failure" output=< Mar 21 05:48:26 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:48:26 crc kubenswrapper[4580]: > Mar 21 05:48:28 crc kubenswrapper[4580]: I0321 05:48:28.618449 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:48:28 crc kubenswrapper[4580]: E0321 05:48:28.618970 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:48:33 crc kubenswrapper[4580]: I0321 05:48:33.965835 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6cc65c69fc-lkjq4_a96026f1-4dcb-483a-83da-aecc72e7590c/manager/0.log" Mar 21 05:48:34 crc kubenswrapper[4580]: I0321 05:48:34.310344 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/util/0.log" Mar 21 05:48:34 crc kubenswrapper[4580]: I0321 05:48:34.620461 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/util/0.log" Mar 21 05:48:34 crc kubenswrapper[4580]: I0321 05:48:34.681975 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/pull/0.log" Mar 21 05:48:34 crc kubenswrapper[4580]: I0321 05:48:34.888402 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/pull/0.log" Mar 21 05:48:35 crc kubenswrapper[4580]: I0321 05:48:35.068507 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6d77645966-vxzhk_5522a0a6-b385-4bf6-990c-5a07561257b0/manager/0.log" Mar 21 05:48:35 crc kubenswrapper[4580]: I0321 05:48:35.128116 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/util/0.log" Mar 21 05:48:35 crc kubenswrapper[4580]: I0321 05:48:35.200003 4580 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/pull/0.log" Mar 21 05:48:35 crc kubenswrapper[4580]: I0321 05:48:35.349253 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/extract/0.log" Mar 21 05:48:35 crc kubenswrapper[4580]: I0321 05:48:35.476551 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:48:35 crc kubenswrapper[4580]: I0321 05:48:35.533556 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:48:35 crc kubenswrapper[4580]: I0321 05:48:35.599529 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7d559dcdbd-gqlhn_fad28507-ca7b-4452-b392-f0b68e1f9d64/manager/0.log" Mar 21 05:48:35 crc kubenswrapper[4580]: I0321 05:48:35.727820 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rc6d6"] Mar 21 05:48:35 crc kubenswrapper[4580]: I0321 05:48:35.886274 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-66dd9d474d-kw6px_4faee52b-73ab-41d7-a319-33eb67e1aa30/manager/0.log" Mar 21 05:48:36 crc kubenswrapper[4580]: I0321 05:48:36.151094 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-64dc66d669-lk8d6_127bc03d-748e-4919-97f8-6f66ab3e2a8a/manager/0.log" Mar 21 05:48:36 crc kubenswrapper[4580]: I0321 05:48:36.534390 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6b77b7676d-kmqjb_21248f61-caf9-4660-8299-3b10368fa8ad/manager/0.log" Mar 21 
05:48:36 crc kubenswrapper[4580]: I0321 05:48:36.726697 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5595c7d6ff-nd42d_b56378b1-33b0-4032-a383-49163ca1811d/manager/0.log" Mar 21 05:48:36 crc kubenswrapper[4580]: I0321 05:48:36.887592 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-784c64596-vdvhl_02dbd40b-11b9-4fca-9617-72b7be489626/manager/0.log" Mar 21 05:48:36 crc kubenswrapper[4580]: I0321 05:48:36.912951 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rc6d6" podUID="9b3b89c8-8fd1-4c72-b487-199450439b10" containerName="registry-server" containerID="cri-o://55dd97cb36d6f5e5a8781c8645c49f9610b0066e8526ba1e74d03e8165b8e919" gracePeriod=2 Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.070178 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-fbf7bbb96-mn29l_d45ead43-2f4d-46fc-857f-7e6dbb3e08f6/manager/0.log" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.451840 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.504757 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6f5b7bcd4-7dkgm_a216d106-9a69-4143-8766-4e505f2b5a8f/manager/0.log" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.516825 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-utilities\") pod \"9b3b89c8-8fd1-4c72-b487-199450439b10\" (UID: \"9b3b89c8-8fd1-4c72-b487-199450439b10\") " Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.517480 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-catalog-content\") pod \"9b3b89c8-8fd1-4c72-b487-199450439b10\" (UID: \"9b3b89c8-8fd1-4c72-b487-199450439b10\") " Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.518958 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77xlt\" (UniqueName: \"kubernetes.io/projected/9b3b89c8-8fd1-4c72-b487-199450439b10-kube-api-access-77xlt\") pod \"9b3b89c8-8fd1-4c72-b487-199450439b10\" (UID: \"9b3b89c8-8fd1-4c72-b487-199450439b10\") " Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.521735 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-utilities" (OuterVolumeSpecName: "utilities") pod "9b3b89c8-8fd1-4c72-b487-199450439b10" (UID: "9b3b89c8-8fd1-4c72-b487-199450439b10"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.544617 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3b89c8-8fd1-4c72-b487-199450439b10-kube-api-access-77xlt" (OuterVolumeSpecName: "kube-api-access-77xlt") pod "9b3b89c8-8fd1-4c72-b487-199450439b10" (UID: "9b3b89c8-8fd1-4c72-b487-199450439b10"). InnerVolumeSpecName "kube-api-access-77xlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.623042 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77xlt\" (UniqueName: \"kubernetes.io/projected/9b3b89c8-8fd1-4c72-b487-199450439b10-kube-api-access-77xlt\") on node \"crc\" DevicePath \"\"" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.623092 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.725024 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b3b89c8-8fd1-4c72-b487-199450439b10" (UID: "9b3b89c8-8fd1-4c72-b487-199450439b10"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.726604 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-catalog-content\") pod \"9b3b89c8-8fd1-4c72-b487-199450439b10\" (UID: \"9b3b89c8-8fd1-4c72-b487-199450439b10\") " Mar 21 05:48:37 crc kubenswrapper[4580]: W0321 05:48:37.731729 4580 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9b3b89c8-8fd1-4c72-b487-199450439b10/volumes/kubernetes.io~empty-dir/catalog-content Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.732278 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b3b89c8-8fd1-4c72-b487-199450439b10" (UID: "9b3b89c8-8fd1-4c72-b487-199450439b10"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.797964 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6744dd545c-sdpcs_6f7dea10-53e8-4c25-87bc-ffd154d4cb7d/manager/0.log" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.830192 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b89c8-8fd1-4c72-b487-199450439b10-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.924678 4580 generic.go:334] "Generic (PLEG): container finished" podID="9b3b89c8-8fd1-4c72-b487-199450439b10" containerID="55dd97cb36d6f5e5a8781c8645c49f9610b0066e8526ba1e74d03e8165b8e919" exitCode=0 Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.924725 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc6d6" event={"ID":"9b3b89c8-8fd1-4c72-b487-199450439b10","Type":"ContainerDied","Data":"55dd97cb36d6f5e5a8781c8645c49f9610b0066e8526ba1e74d03e8165b8e919"} Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.924750 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rc6d6" event={"ID":"9b3b89c8-8fd1-4c72-b487-199450439b10","Type":"ContainerDied","Data":"f6aeb32e0a608969979d4e5b29d95eb9e17078dee24dc77e32d8b4b67e40ca5b"} Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.924766 4580 scope.go:117] "RemoveContainer" containerID="55dd97cb36d6f5e5a8781c8645c49f9610b0066e8526ba1e74d03e8165b8e919" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.924770 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rc6d6" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.978098 4580 scope.go:117] "RemoveContainer" containerID="84099185e9858dc85e9a1e460d0f25784a358d4ce238403b63d13033c3033c09" Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.995867 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rc6d6"] Mar 21 05:48:37 crc kubenswrapper[4580]: I0321 05:48:37.997758 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-bc5c78db9-d5jll_2535b22b-0bed-4ffd-9430-ca9fb3230c62/manager/0.log" Mar 21 05:48:38 crc kubenswrapper[4580]: I0321 05:48:38.005398 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rc6d6"] Mar 21 05:48:38 crc kubenswrapper[4580]: I0321 05:48:38.067179 4580 scope.go:117] "RemoveContainer" containerID="6648488027b56aa5842a89bb093e874a3f7fdc87c2975e4b9593c118d7987fec" Mar 21 05:48:38 crc kubenswrapper[4580]: I0321 05:48:38.102842 4580 scope.go:117] "RemoveContainer" containerID="55dd97cb36d6f5e5a8781c8645c49f9610b0066e8526ba1e74d03e8165b8e919" Mar 21 05:48:38 crc kubenswrapper[4580]: E0321 05:48:38.103907 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55dd97cb36d6f5e5a8781c8645c49f9610b0066e8526ba1e74d03e8165b8e919\": container with ID starting with 55dd97cb36d6f5e5a8781c8645c49f9610b0066e8526ba1e74d03e8165b8e919 not found: ID does not exist" containerID="55dd97cb36d6f5e5a8781c8645c49f9610b0066e8526ba1e74d03e8165b8e919" Mar 21 05:48:38 crc kubenswrapper[4580]: I0321 05:48:38.105055 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55dd97cb36d6f5e5a8781c8645c49f9610b0066e8526ba1e74d03e8165b8e919"} err="failed to get container status 
\"55dd97cb36d6f5e5a8781c8645c49f9610b0066e8526ba1e74d03e8165b8e919\": rpc error: code = NotFound desc = could not find container \"55dd97cb36d6f5e5a8781c8645c49f9610b0066e8526ba1e74d03e8165b8e919\": container with ID starting with 55dd97cb36d6f5e5a8781c8645c49f9610b0066e8526ba1e74d03e8165b8e919 not found: ID does not exist" Mar 21 05:48:38 crc kubenswrapper[4580]: I0321 05:48:38.105086 4580 scope.go:117] "RemoveContainer" containerID="84099185e9858dc85e9a1e460d0f25784a358d4ce238403b63d13033c3033c09" Mar 21 05:48:38 crc kubenswrapper[4580]: E0321 05:48:38.105379 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84099185e9858dc85e9a1e460d0f25784a358d4ce238403b63d13033c3033c09\": container with ID starting with 84099185e9858dc85e9a1e460d0f25784a358d4ce238403b63d13033c3033c09 not found: ID does not exist" containerID="84099185e9858dc85e9a1e460d0f25784a358d4ce238403b63d13033c3033c09" Mar 21 05:48:38 crc kubenswrapper[4580]: I0321 05:48:38.105482 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84099185e9858dc85e9a1e460d0f25784a358d4ce238403b63d13033c3033c09"} err="failed to get container status \"84099185e9858dc85e9a1e460d0f25784a358d4ce238403b63d13033c3033c09\": rpc error: code = NotFound desc = could not find container \"84099185e9858dc85e9a1e460d0f25784a358d4ce238403b63d13033c3033c09\": container with ID starting with 84099185e9858dc85e9a1e460d0f25784a358d4ce238403b63d13033c3033c09 not found: ID does not exist" Mar 21 05:48:38 crc kubenswrapper[4580]: I0321 05:48:38.105560 4580 scope.go:117] "RemoveContainer" containerID="6648488027b56aa5842a89bb093e874a3f7fdc87c2975e4b9593c118d7987fec" Mar 21 05:48:38 crc kubenswrapper[4580]: E0321 05:48:38.116068 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6648488027b56aa5842a89bb093e874a3f7fdc87c2975e4b9593c118d7987fec\": container with ID starting with 6648488027b56aa5842a89bb093e874a3f7fdc87c2975e4b9593c118d7987fec not found: ID does not exist" containerID="6648488027b56aa5842a89bb093e874a3f7fdc87c2975e4b9593c118d7987fec" Mar 21 05:48:38 crc kubenswrapper[4580]: I0321 05:48:38.116232 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6648488027b56aa5842a89bb093e874a3f7fdc87c2975e4b9593c118d7987fec"} err="failed to get container status \"6648488027b56aa5842a89bb093e874a3f7fdc87c2975e4b9593c118d7987fec\": rpc error: code = NotFound desc = could not find container \"6648488027b56aa5842a89bb093e874a3f7fdc87c2975e4b9593c118d7987fec\": container with ID starting with 6648488027b56aa5842a89bb093e874a3f7fdc87c2975e4b9593c118d7987fec not found: ID does not exist" Mar 21 05:48:38 crc kubenswrapper[4580]: I0321 05:48:38.322389 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-56f74467c6-2h95r_67d1d125-57c7-4c30-a51a-24db28fb4818/manager/0.log" Mar 21 05:48:38 crc kubenswrapper[4580]: I0321 05:48:38.539512 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v_2e366c15-abc4-4e05-9054-cd7828e00059/manager/0.log" Mar 21 05:48:38 crc kubenswrapper[4580]: I0321 05:48:38.806704 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-68b88cfb78-cf92m_d316a65b-0041-42cc-bf46-c6c8801c44a5/operator/0.log" Mar 21 05:48:39 crc kubenswrapper[4580]: I0321 05:48:39.106351 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-r2tm2_0b8cf1e5-6f84-4595-be65-efc781baa914/registry-server/0.log" Mar 21 05:48:39 crc kubenswrapper[4580]: I0321 05:48:39.332576 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5cfd84c587-6h2nr_d3c84591-dfcf-48e6-a022-25562660675e/manager/0.log" Mar 21 05:48:39 crc kubenswrapper[4580]: I0321 05:48:39.538731 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-846c4cdcb7-kg5cb_bd6a540f-0a1b-4098-8573-b9049d52f49b/manager/0.log" Mar 21 05:48:39 crc kubenswrapper[4580]: I0321 05:48:39.642316 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b3b89c8-8fd1-4c72-b487-199450439b10" path="/var/lib/kubelet/pods/9b3b89c8-8fd1-4c72-b487-199450439b10/volumes" Mar 21 05:48:39 crc kubenswrapper[4580]: I0321 05:48:39.705116 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-659fb58c6b-ln984_0e31c4f0-9b9d-4c10-84de-d15718775f9a/manager/0.log" Mar 21 05:48:40 crc kubenswrapper[4580]: I0321 05:48:40.082328 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5mdkx_92286fdb-b69e-4028-8a93-3517469a731c/operator/0.log" Mar 21 05:48:40 crc kubenswrapper[4580]: I0321 05:48:40.118661 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-867f54bc44-78zl5_8b025526-696f-4d7d-82ed-df03050fa1fd/manager/0.log" Mar 21 05:48:40 crc kubenswrapper[4580]: I0321 05:48:40.395765 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d84559f47-g52cx_2a0c721d-68cd-46de-8292-6bd8373e1106/manager/0.log" Mar 21 05:48:40 crc kubenswrapper[4580]: I0321 05:48:40.461884 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-684bbdfff8-7nr7w_19a22b87-c6f3-4020-aa11-2a940041f49c/manager/0.log" Mar 21 05:48:40 crc kubenswrapper[4580]: I0321 05:48:40.592389 4580 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-rqw4v_5992ccfd-4585-49b2-84ff-3f1fe6812a82/manager/0.log" Mar 21 05:48:40 crc kubenswrapper[4580]: I0321 05:48:40.771977 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-74d6f7b5c-q8jxn_6080f6a7-a68f-447a-bedd-182cd69337b5/manager/0.log" Mar 21 05:48:42 crc kubenswrapper[4580]: I0321 05:48:42.617471 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:48:42 crc kubenswrapper[4580]: E0321 05:48:42.618112 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:48:56 crc kubenswrapper[4580]: I0321 05:48:56.618182 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:48:56 crc kubenswrapper[4580]: E0321 05:48:56.618815 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:49:03 crc kubenswrapper[4580]: I0321 05:49:03.871488 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jjvsm_48bae9ea-47af-484e-a8c4-b6c3e49438e5/control-plane-machine-set-operator/0.log" Mar 21 05:49:04 crc kubenswrapper[4580]: I0321 05:49:04.130575 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cvqg4_da00c0bc-2ff1-4b15-be1f-8fac48921976/kube-rbac-proxy/0.log" Mar 21 05:49:04 crc kubenswrapper[4580]: I0321 05:49:04.134683 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cvqg4_da00c0bc-2ff1-4b15-be1f-8fac48921976/machine-api-operator/0.log" Mar 21 05:49:10 crc kubenswrapper[4580]: I0321 05:49:10.617891 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:49:10 crc kubenswrapper[4580]: E0321 05:49:10.618618 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:49:15 crc kubenswrapper[4580]: I0321 05:49:15.252130 4580 scope.go:117] "RemoveContainer" containerID="74a2cd779dbe0f77366a7563a48b4eec978d36acf992caba34164deef464e369" Mar 21 05:49:20 crc kubenswrapper[4580]: I0321 05:49:20.613113 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-ssv6b_cc874172-d2d5-4811-8b52-8822da8cb97f/cert-manager-controller/0.log" Mar 21 05:49:20 crc kubenswrapper[4580]: I0321 05:49:20.947497 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-kcmc2_7154e53a-0974-4463-9d9a-20cea09f0e94/cert-manager-webhook/0.log" Mar 21 05:49:20 crc kubenswrapper[4580]: I0321 05:49:20.955340 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-plhxw_cdd054c6-1468-4d34-866c-612b69c7bb4f/cert-manager-cainjector/0.log" Mar 21 05:49:21 crc kubenswrapper[4580]: I0321 05:49:21.620683 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:49:21 crc kubenswrapper[4580]: E0321 05:49:21.623011 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:49:26 crc kubenswrapper[4580]: I0321 05:49:26.935268 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vqczp"] Mar 21 05:49:26 crc kubenswrapper[4580]: E0321 05:49:26.936320 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3b89c8-8fd1-4c72-b487-199450439b10" containerName="extract-content" Mar 21 05:49:26 crc kubenswrapper[4580]: I0321 05:49:26.936340 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3b89c8-8fd1-4c72-b487-199450439b10" containerName="extract-content" Mar 21 05:49:26 crc kubenswrapper[4580]: E0321 05:49:26.936364 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3b89c8-8fd1-4c72-b487-199450439b10" containerName="registry-server" Mar 21 05:49:26 crc kubenswrapper[4580]: I0321 05:49:26.936372 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3b89c8-8fd1-4c72-b487-199450439b10" 
containerName="registry-server" Mar 21 05:49:26 crc kubenswrapper[4580]: E0321 05:49:26.936391 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9730819c-907e-4985-858b-dbed32715065" containerName="oc" Mar 21 05:49:26 crc kubenswrapper[4580]: I0321 05:49:26.936400 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9730819c-907e-4985-858b-dbed32715065" containerName="oc" Mar 21 05:49:26 crc kubenswrapper[4580]: E0321 05:49:26.936421 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3b89c8-8fd1-4c72-b487-199450439b10" containerName="extract-utilities" Mar 21 05:49:26 crc kubenswrapper[4580]: I0321 05:49:26.936428 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3b89c8-8fd1-4c72-b487-199450439b10" containerName="extract-utilities" Mar 21 05:49:26 crc kubenswrapper[4580]: I0321 05:49:26.936627 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b3b89c8-8fd1-4c72-b487-199450439b10" containerName="registry-server" Mar 21 05:49:26 crc kubenswrapper[4580]: I0321 05:49:26.936660 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="9730819c-907e-4985-858b-dbed32715065" containerName="oc" Mar 21 05:49:26 crc kubenswrapper[4580]: I0321 05:49:26.938066 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:26 crc kubenswrapper[4580]: I0321 05:49:26.957549 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqczp"] Mar 21 05:49:27 crc kubenswrapper[4580]: I0321 05:49:27.006298 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83edc420-7b1e-4e32-8a9d-dc63927e3c17-catalog-content\") pod \"community-operators-vqczp\" (UID: \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\") " pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:27 crc kubenswrapper[4580]: I0321 05:49:27.007378 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83edc420-7b1e-4e32-8a9d-dc63927e3c17-utilities\") pod \"community-operators-vqczp\" (UID: \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\") " pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:27 crc kubenswrapper[4580]: I0321 05:49:27.007577 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzn7b\" (UniqueName: \"kubernetes.io/projected/83edc420-7b1e-4e32-8a9d-dc63927e3c17-kube-api-access-bzn7b\") pod \"community-operators-vqczp\" (UID: \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\") " pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:27 crc kubenswrapper[4580]: I0321 05:49:27.109732 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83edc420-7b1e-4e32-8a9d-dc63927e3c17-catalog-content\") pod \"community-operators-vqczp\" (UID: \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\") " pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:27 crc kubenswrapper[4580]: I0321 05:49:27.110612 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83edc420-7b1e-4e32-8a9d-dc63927e3c17-utilities\") pod \"community-operators-vqczp\" (UID: \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\") " pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:27 crc kubenswrapper[4580]: I0321 05:49:27.111010 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzn7b\" (UniqueName: \"kubernetes.io/projected/83edc420-7b1e-4e32-8a9d-dc63927e3c17-kube-api-access-bzn7b\") pod \"community-operators-vqczp\" (UID: \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\") " pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:27 crc kubenswrapper[4580]: I0321 05:49:27.110965 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83edc420-7b1e-4e32-8a9d-dc63927e3c17-utilities\") pod \"community-operators-vqczp\" (UID: \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\") " pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:27 crc kubenswrapper[4580]: I0321 05:49:27.110520 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83edc420-7b1e-4e32-8a9d-dc63927e3c17-catalog-content\") pod \"community-operators-vqczp\" (UID: \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\") " pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:27 crc kubenswrapper[4580]: I0321 05:49:27.148071 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzn7b\" (UniqueName: \"kubernetes.io/projected/83edc420-7b1e-4e32-8a9d-dc63927e3c17-kube-api-access-bzn7b\") pod \"community-operators-vqczp\" (UID: \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\") " pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:27 crc kubenswrapper[4580]: I0321 05:49:27.277216 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:27 crc kubenswrapper[4580]: I0321 05:49:27.987849 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqczp"] Mar 21 05:49:28 crc kubenswrapper[4580]: I0321 05:49:28.412650 4580 generic.go:334] "Generic (PLEG): container finished" podID="83edc420-7b1e-4e32-8a9d-dc63927e3c17" containerID="6f2276ab6c6b6aef0fcc0f1645197d5247dfde22e04149afd3efd4954da88a40" exitCode=0 Mar 21 05:49:28 crc kubenswrapper[4580]: I0321 05:49:28.412735 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqczp" event={"ID":"83edc420-7b1e-4e32-8a9d-dc63927e3c17","Type":"ContainerDied","Data":"6f2276ab6c6b6aef0fcc0f1645197d5247dfde22e04149afd3efd4954da88a40"} Mar 21 05:49:28 crc kubenswrapper[4580]: I0321 05:49:28.412981 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqczp" event={"ID":"83edc420-7b1e-4e32-8a9d-dc63927e3c17","Type":"ContainerStarted","Data":"3ecaaa18f6a350a8e9b07b164f9c460ada30636c807e18d9ce9657123d921cb8"} Mar 21 05:49:29 crc kubenswrapper[4580]: I0321 05:49:29.423913 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqczp" event={"ID":"83edc420-7b1e-4e32-8a9d-dc63927e3c17","Type":"ContainerStarted","Data":"e2bf7bf065a1f4d93d0b6457d8e3e39eedd792daba1fb3fbf1f95a1f047610c9"} Mar 21 05:49:31 crc kubenswrapper[4580]: I0321 05:49:31.442578 4580 generic.go:334] "Generic (PLEG): container finished" podID="83edc420-7b1e-4e32-8a9d-dc63927e3c17" containerID="e2bf7bf065a1f4d93d0b6457d8e3e39eedd792daba1fb3fbf1f95a1f047610c9" exitCode=0 Mar 21 05:49:31 crc kubenswrapper[4580]: I0321 05:49:31.442666 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqczp" 
event={"ID":"83edc420-7b1e-4e32-8a9d-dc63927e3c17","Type":"ContainerDied","Data":"e2bf7bf065a1f4d93d0b6457d8e3e39eedd792daba1fb3fbf1f95a1f047610c9"} Mar 21 05:49:32 crc kubenswrapper[4580]: I0321 05:49:32.453589 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqczp" event={"ID":"83edc420-7b1e-4e32-8a9d-dc63927e3c17","Type":"ContainerStarted","Data":"a3988ec347bc7b42024f3e096695df8e48ca7aab88b273ed0626dafb609fb9e9"} Mar 21 05:49:32 crc kubenswrapper[4580]: I0321 05:49:32.480361 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vqczp" podStartSLOduration=2.978312794 podStartE2EDuration="6.480338896s" podCreationTimestamp="2026-03-21 05:49:26 +0000 UTC" firstStartedPulling="2026-03-21 05:49:28.414104281 +0000 UTC m=+3473.496687909" lastFinishedPulling="2026-03-21 05:49:31.916130383 +0000 UTC m=+3476.998714011" observedRunningTime="2026-03-21 05:49:32.474578352 +0000 UTC m=+3477.557161980" watchObservedRunningTime="2026-03-21 05:49:32.480338896 +0000 UTC m=+3477.562922524" Mar 21 05:49:32 crc kubenswrapper[4580]: I0321 05:49:32.618061 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:49:32 crc kubenswrapper[4580]: E0321 05:49:32.618315 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:49:37 crc kubenswrapper[4580]: I0321 05:49:37.070965 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-6t69s_3f4e68cb-70a1-40bd-815d-e35e0a3337a0/nmstate-console-plugin/0.log" Mar 21 05:49:37 crc kubenswrapper[4580]: I0321 05:49:37.275544 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-ftlkm_c458e5d3-8c0a-4135-aba8-54854b16c411/nmstate-handler/0.log" Mar 21 05:49:37 crc kubenswrapper[4580]: I0321 05:49:37.278215 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:37 crc kubenswrapper[4580]: I0321 05:49:37.278266 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:37 crc kubenswrapper[4580]: I0321 05:49:37.324452 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-zfbrs_1b699e53-ece3-49d9-9f68-c3558aef7892/kube-rbac-proxy/0.log" Mar 21 05:49:37 crc kubenswrapper[4580]: I0321 05:49:37.341214 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:37 crc kubenswrapper[4580]: I0321 05:49:37.409572 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-zfbrs_1b699e53-ece3-49d9-9f68-c3558aef7892/nmstate-metrics/0.log" Mar 21 05:49:37 crc kubenswrapper[4580]: I0321 05:49:37.566566 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:37 crc kubenswrapper[4580]: I0321 05:49:37.573804 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-2fctj_b980d3d5-4f25-47c0-9679-8662b237e1b7/nmstate-operator/0.log" Mar 21 05:49:37 crc kubenswrapper[4580]: I0321 05:49:37.609230 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-vqczp"] Mar 21 05:49:37 crc kubenswrapper[4580]: I0321 05:49:37.673581 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-w4mjf_449ae922-f55a-437e-b18a-d6e2700cc02e/nmstate-webhook/0.log" Mar 21 05:49:39 crc kubenswrapper[4580]: I0321 05:49:39.539898 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vqczp" podUID="83edc420-7b1e-4e32-8a9d-dc63927e3c17" containerName="registry-server" containerID="cri-o://a3988ec347bc7b42024f3e096695df8e48ca7aab88b273ed0626dafb609fb9e9" gracePeriod=2 Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.045409 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.089915 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83edc420-7b1e-4e32-8a9d-dc63927e3c17-catalog-content\") pod \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\" (UID: \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\") " Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.092689 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83edc420-7b1e-4e32-8a9d-dc63927e3c17-utilities\") pod \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\" (UID: \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\") " Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.093484 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzn7b\" (UniqueName: \"kubernetes.io/projected/83edc420-7b1e-4e32-8a9d-dc63927e3c17-kube-api-access-bzn7b\") pod \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\" (UID: \"83edc420-7b1e-4e32-8a9d-dc63927e3c17\") " Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.097883 
4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83edc420-7b1e-4e32-8a9d-dc63927e3c17-utilities" (OuterVolumeSpecName: "utilities") pod "83edc420-7b1e-4e32-8a9d-dc63927e3c17" (UID: "83edc420-7b1e-4e32-8a9d-dc63927e3c17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.106409 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83edc420-7b1e-4e32-8a9d-dc63927e3c17-kube-api-access-bzn7b" (OuterVolumeSpecName: "kube-api-access-bzn7b") pod "83edc420-7b1e-4e32-8a9d-dc63927e3c17" (UID: "83edc420-7b1e-4e32-8a9d-dc63927e3c17"). InnerVolumeSpecName "kube-api-access-bzn7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.155912 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83edc420-7b1e-4e32-8a9d-dc63927e3c17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83edc420-7b1e-4e32-8a9d-dc63927e3c17" (UID: "83edc420-7b1e-4e32-8a9d-dc63927e3c17"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.196554 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83edc420-7b1e-4e32-8a9d-dc63927e3c17-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.196594 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzn7b\" (UniqueName: \"kubernetes.io/projected/83edc420-7b1e-4e32-8a9d-dc63927e3c17-kube-api-access-bzn7b\") on node \"crc\" DevicePath \"\"" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.196606 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83edc420-7b1e-4e32-8a9d-dc63927e3c17-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.549870 4580 generic.go:334] "Generic (PLEG): container finished" podID="83edc420-7b1e-4e32-8a9d-dc63927e3c17" containerID="a3988ec347bc7b42024f3e096695df8e48ca7aab88b273ed0626dafb609fb9e9" exitCode=0 Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.549969 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqczp" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.549989 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqczp" event={"ID":"83edc420-7b1e-4e32-8a9d-dc63927e3c17","Type":"ContainerDied","Data":"a3988ec347bc7b42024f3e096695df8e48ca7aab88b273ed0626dafb609fb9e9"} Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.551109 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqczp" event={"ID":"83edc420-7b1e-4e32-8a9d-dc63927e3c17","Type":"ContainerDied","Data":"3ecaaa18f6a350a8e9b07b164f9c460ada30636c807e18d9ce9657123d921cb8"} Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.551140 4580 scope.go:117] "RemoveContainer" containerID="a3988ec347bc7b42024f3e096695df8e48ca7aab88b273ed0626dafb609fb9e9" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.580380 4580 scope.go:117] "RemoveContainer" containerID="e2bf7bf065a1f4d93d0b6457d8e3e39eedd792daba1fb3fbf1f95a1f047610c9" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.599856 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqczp"] Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.608269 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vqczp"] Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.608538 4580 scope.go:117] "RemoveContainer" containerID="6f2276ab6c6b6aef0fcc0f1645197d5247dfde22e04149afd3efd4954da88a40" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.645684 4580 scope.go:117] "RemoveContainer" containerID="a3988ec347bc7b42024f3e096695df8e48ca7aab88b273ed0626dafb609fb9e9" Mar 21 05:49:40 crc kubenswrapper[4580]: E0321 05:49:40.646210 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a3988ec347bc7b42024f3e096695df8e48ca7aab88b273ed0626dafb609fb9e9\": container with ID starting with a3988ec347bc7b42024f3e096695df8e48ca7aab88b273ed0626dafb609fb9e9 not found: ID does not exist" containerID="a3988ec347bc7b42024f3e096695df8e48ca7aab88b273ed0626dafb609fb9e9" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.646250 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3988ec347bc7b42024f3e096695df8e48ca7aab88b273ed0626dafb609fb9e9"} err="failed to get container status \"a3988ec347bc7b42024f3e096695df8e48ca7aab88b273ed0626dafb609fb9e9\": rpc error: code = NotFound desc = could not find container \"a3988ec347bc7b42024f3e096695df8e48ca7aab88b273ed0626dafb609fb9e9\": container with ID starting with a3988ec347bc7b42024f3e096695df8e48ca7aab88b273ed0626dafb609fb9e9 not found: ID does not exist" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.646277 4580 scope.go:117] "RemoveContainer" containerID="e2bf7bf065a1f4d93d0b6457d8e3e39eedd792daba1fb3fbf1f95a1f047610c9" Mar 21 05:49:40 crc kubenswrapper[4580]: E0321 05:49:40.646581 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2bf7bf065a1f4d93d0b6457d8e3e39eedd792daba1fb3fbf1f95a1f047610c9\": container with ID starting with e2bf7bf065a1f4d93d0b6457d8e3e39eedd792daba1fb3fbf1f95a1f047610c9 not found: ID does not exist" containerID="e2bf7bf065a1f4d93d0b6457d8e3e39eedd792daba1fb3fbf1f95a1f047610c9" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.646620 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2bf7bf065a1f4d93d0b6457d8e3e39eedd792daba1fb3fbf1f95a1f047610c9"} err="failed to get container status \"e2bf7bf065a1f4d93d0b6457d8e3e39eedd792daba1fb3fbf1f95a1f047610c9\": rpc error: code = NotFound desc = could not find container \"e2bf7bf065a1f4d93d0b6457d8e3e39eedd792daba1fb3fbf1f95a1f047610c9\": container with ID 
starting with e2bf7bf065a1f4d93d0b6457d8e3e39eedd792daba1fb3fbf1f95a1f047610c9 not found: ID does not exist" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.646634 4580 scope.go:117] "RemoveContainer" containerID="6f2276ab6c6b6aef0fcc0f1645197d5247dfde22e04149afd3efd4954da88a40" Mar 21 05:49:40 crc kubenswrapper[4580]: E0321 05:49:40.646920 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2276ab6c6b6aef0fcc0f1645197d5247dfde22e04149afd3efd4954da88a40\": container with ID starting with 6f2276ab6c6b6aef0fcc0f1645197d5247dfde22e04149afd3efd4954da88a40 not found: ID does not exist" containerID="6f2276ab6c6b6aef0fcc0f1645197d5247dfde22e04149afd3efd4954da88a40" Mar 21 05:49:40 crc kubenswrapper[4580]: I0321 05:49:40.646943 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2276ab6c6b6aef0fcc0f1645197d5247dfde22e04149afd3efd4954da88a40"} err="failed to get container status \"6f2276ab6c6b6aef0fcc0f1645197d5247dfde22e04149afd3efd4954da88a40\": rpc error: code = NotFound desc = could not find container \"6f2276ab6c6b6aef0fcc0f1645197d5247dfde22e04149afd3efd4954da88a40\": container with ID starting with 6f2276ab6c6b6aef0fcc0f1645197d5247dfde22e04149afd3efd4954da88a40 not found: ID does not exist" Mar 21 05:49:41 crc kubenswrapper[4580]: I0321 05:49:41.630037 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83edc420-7b1e-4e32-8a9d-dc63927e3c17" path="/var/lib/kubelet/pods/83edc420-7b1e-4e32-8a9d-dc63927e3c17/volumes" Mar 21 05:49:45 crc kubenswrapper[4580]: I0321 05:49:45.638769 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:49:45 crc kubenswrapper[4580]: E0321 05:49:45.640047 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.148019 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567870-np7k2"] Mar 21 05:50:00 crc kubenswrapper[4580]: E0321 05:50:00.148922 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83edc420-7b1e-4e32-8a9d-dc63927e3c17" containerName="extract-utilities" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.148936 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="83edc420-7b1e-4e32-8a9d-dc63927e3c17" containerName="extract-utilities" Mar 21 05:50:00 crc kubenswrapper[4580]: E0321 05:50:00.148957 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83edc420-7b1e-4e32-8a9d-dc63927e3c17" containerName="extract-content" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.148964 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="83edc420-7b1e-4e32-8a9d-dc63927e3c17" containerName="extract-content" Mar 21 05:50:00 crc kubenswrapper[4580]: E0321 05:50:00.148990 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83edc420-7b1e-4e32-8a9d-dc63927e3c17" containerName="registry-server" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.148997 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="83edc420-7b1e-4e32-8a9d-dc63927e3c17" containerName="registry-server" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.149181 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="83edc420-7b1e-4e32-8a9d-dc63927e3c17" containerName="registry-server" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.149766 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567870-np7k2" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.152087 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.152569 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.154101 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.163558 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567870-np7k2"] Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.275106 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckdm6\" (UniqueName: \"kubernetes.io/projected/0dd778e7-6536-464a-8ce1-e9a26c7d8c58-kube-api-access-ckdm6\") pod \"auto-csr-approver-29567870-np7k2\" (UID: \"0dd778e7-6536-464a-8ce1-e9a26c7d8c58\") " pod="openshift-infra/auto-csr-approver-29567870-np7k2" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.376559 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdm6\" (UniqueName: \"kubernetes.io/projected/0dd778e7-6536-464a-8ce1-e9a26c7d8c58-kube-api-access-ckdm6\") pod \"auto-csr-approver-29567870-np7k2\" (UID: \"0dd778e7-6536-464a-8ce1-e9a26c7d8c58\") " pod="openshift-infra/auto-csr-approver-29567870-np7k2" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.404629 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckdm6\" (UniqueName: \"kubernetes.io/projected/0dd778e7-6536-464a-8ce1-e9a26c7d8c58-kube-api-access-ckdm6\") pod \"auto-csr-approver-29567870-np7k2\" (UID: \"0dd778e7-6536-464a-8ce1-e9a26c7d8c58\") " 
pod="openshift-infra/auto-csr-approver-29567870-np7k2" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.471198 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567870-np7k2" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.617402 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:50:00 crc kubenswrapper[4580]: E0321 05:50:00.618053 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:50:00 crc kubenswrapper[4580]: I0321 05:50:00.939140 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567870-np7k2"] Mar 21 05:50:01 crc kubenswrapper[4580]: I0321 05:50:01.722732 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567870-np7k2" event={"ID":"0dd778e7-6536-464a-8ce1-e9a26c7d8c58","Type":"ContainerStarted","Data":"7a33448eb86a0c0eaff3d2459583c8a0d8061fc18070171a9b42864aea763c39"} Mar 21 05:50:02 crc kubenswrapper[4580]: I0321 05:50:02.732738 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567870-np7k2" event={"ID":"0dd778e7-6536-464a-8ce1-e9a26c7d8c58","Type":"ContainerStarted","Data":"b70ae59290c3cc82e651e88234901a2d9a69ac07c47a3c956080528e0834621f"} Mar 21 05:50:02 crc kubenswrapper[4580]: I0321 05:50:02.756755 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567870-np7k2" podStartSLOduration=1.6444208919999999 
podStartE2EDuration="2.756734087s" podCreationTimestamp="2026-03-21 05:50:00 +0000 UTC" firstStartedPulling="2026-03-21 05:50:00.945021358 +0000 UTC m=+3506.027604976" lastFinishedPulling="2026-03-21 05:50:02.057334543 +0000 UTC m=+3507.139918171" observedRunningTime="2026-03-21 05:50:02.750502289 +0000 UTC m=+3507.833085917" watchObservedRunningTime="2026-03-21 05:50:02.756734087 +0000 UTC m=+3507.839317725" Mar 21 05:50:03 crc kubenswrapper[4580]: I0321 05:50:03.746198 4580 generic.go:334] "Generic (PLEG): container finished" podID="0dd778e7-6536-464a-8ce1-e9a26c7d8c58" containerID="b70ae59290c3cc82e651e88234901a2d9a69ac07c47a3c956080528e0834621f" exitCode=0 Mar 21 05:50:03 crc kubenswrapper[4580]: I0321 05:50:03.746287 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567870-np7k2" event={"ID":"0dd778e7-6536-464a-8ce1-e9a26c7d8c58","Type":"ContainerDied","Data":"b70ae59290c3cc82e651e88234901a2d9a69ac07c47a3c956080528e0834621f"} Mar 21 05:50:05 crc kubenswrapper[4580]: I0321 05:50:05.136449 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567870-np7k2" Mar 21 05:50:05 crc kubenswrapper[4580]: I0321 05:50:05.280371 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckdm6\" (UniqueName: \"kubernetes.io/projected/0dd778e7-6536-464a-8ce1-e9a26c7d8c58-kube-api-access-ckdm6\") pod \"0dd778e7-6536-464a-8ce1-e9a26c7d8c58\" (UID: \"0dd778e7-6536-464a-8ce1-e9a26c7d8c58\") " Mar 21 05:50:05 crc kubenswrapper[4580]: I0321 05:50:05.286804 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd778e7-6536-464a-8ce1-e9a26c7d8c58-kube-api-access-ckdm6" (OuterVolumeSpecName: "kube-api-access-ckdm6") pod "0dd778e7-6536-464a-8ce1-e9a26c7d8c58" (UID: "0dd778e7-6536-464a-8ce1-e9a26c7d8c58"). InnerVolumeSpecName "kube-api-access-ckdm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:50:05 crc kubenswrapper[4580]: I0321 05:50:05.382868 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckdm6\" (UniqueName: \"kubernetes.io/projected/0dd778e7-6536-464a-8ce1-e9a26c7d8c58-kube-api-access-ckdm6\") on node \"crc\" DevicePath \"\"" Mar 21 05:50:05 crc kubenswrapper[4580]: I0321 05:50:05.764547 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567870-np7k2" event={"ID":"0dd778e7-6536-464a-8ce1-e9a26c7d8c58","Type":"ContainerDied","Data":"7a33448eb86a0c0eaff3d2459583c8a0d8061fc18070171a9b42864aea763c39"} Mar 21 05:50:05 crc kubenswrapper[4580]: I0321 05:50:05.764585 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a33448eb86a0c0eaff3d2459583c8a0d8061fc18070171a9b42864aea763c39" Mar 21 05:50:05 crc kubenswrapper[4580]: I0321 05:50:05.764655 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567870-np7k2" Mar 21 05:50:05 crc kubenswrapper[4580]: I0321 05:50:05.829567 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567864-d4qkl"] Mar 21 05:50:05 crc kubenswrapper[4580]: I0321 05:50:05.837907 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567864-d4qkl"] Mar 21 05:50:07 crc kubenswrapper[4580]: I0321 05:50:07.628520 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48ad4a5c-b320-48f3-b044-b2626b51e069" path="/var/lib/kubelet/pods/48ad4a5c-b320-48f3-b044-b2626b51e069/volumes" Mar 21 05:50:09 crc kubenswrapper[4580]: I0321 05:50:09.436038 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-vlfhg_ab7c431f-e254-4c36-a240-15ec5cbb14e9/kube-rbac-proxy/0.log" Mar 21 05:50:09 crc kubenswrapper[4580]: I0321 05:50:09.570535 4580 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-vlfhg_ab7c431f-e254-4c36-a240-15ec5cbb14e9/controller/0.log" Mar 21 05:50:09 crc kubenswrapper[4580]: I0321 05:50:09.733117 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-frr-files/0.log" Mar 21 05:50:09 crc kubenswrapper[4580]: I0321 05:50:09.980485 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-reloader/0.log" Mar 21 05:50:10 crc kubenswrapper[4580]: I0321 05:50:10.036565 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-frr-files/0.log" Mar 21 05:50:10 crc kubenswrapper[4580]: I0321 05:50:10.068738 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-reloader/0.log" Mar 21 05:50:10 crc kubenswrapper[4580]: I0321 05:50:10.093365 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-metrics/0.log" Mar 21 05:50:10 crc kubenswrapper[4580]: I0321 05:50:10.243030 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-reloader/0.log" Mar 21 05:50:10 crc kubenswrapper[4580]: I0321 05:50:10.273212 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-metrics/0.log" Mar 21 05:50:10 crc kubenswrapper[4580]: I0321 05:50:10.302621 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-frr-files/0.log" Mar 21 05:50:10 crc kubenswrapper[4580]: I0321 05:50:10.314993 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-metrics/0.log" Mar 21 05:50:10 crc kubenswrapper[4580]: I0321 05:50:10.545818 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-frr-files/0.log" Mar 21 05:50:10 crc kubenswrapper[4580]: I0321 05:50:10.626090 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-metrics/0.log" Mar 21 05:50:10 crc kubenswrapper[4580]: I0321 05:50:10.648331 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-reloader/0.log" Mar 21 05:50:10 crc kubenswrapper[4580]: I0321 05:50:10.660886 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/controller/0.log" Mar 21 05:50:10 crc kubenswrapper[4580]: I0321 05:50:10.889653 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/frr-metrics/0.log" Mar 21 05:50:11 crc kubenswrapper[4580]: I0321 05:50:11.006955 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/kube-rbac-proxy/0.log" Mar 21 05:50:11 crc kubenswrapper[4580]: I0321 05:50:11.008235 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/kube-rbac-proxy-frr/0.log" Mar 21 05:50:11 crc kubenswrapper[4580]: I0321 05:50:11.179901 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/reloader/0.log" Mar 21 05:50:11 crc kubenswrapper[4580]: I0321 05:50:11.306640 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-qdhgl_254226c1-a87d-4bca-a3d4-a909452fa9ac/frr-k8s-webhook-server/0.log" Mar 21 05:50:11 crc kubenswrapper[4580]: I0321 05:50:11.896803 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-cf889bd6-5qrxw_22ee4637-a40f-4200-be5a-679e0912f4cf/manager/0.log" Mar 21 05:50:12 crc kubenswrapper[4580]: I0321 05:50:12.020018 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/frr/0.log" Mar 21 05:50:12 crc kubenswrapper[4580]: I0321 05:50:12.160570 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d5b654677-cwqcf_1d9a7806-fa8d-4106-9241-a32afafc5eb7/webhook-server/0.log" Mar 21 05:50:12 crc kubenswrapper[4580]: I0321 05:50:12.181495 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-smqrq_d2b75e08-f7c9-47c8-9b08-f574bb92461d/kube-rbac-proxy/0.log" Mar 21 05:50:12 crc kubenswrapper[4580]: I0321 05:50:12.565087 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-smqrq_d2b75e08-f7c9-47c8-9b08-f574bb92461d/speaker/0.log" Mar 21 05:50:15 crc kubenswrapper[4580]: I0321 05:50:15.369078 4580 scope.go:117] "RemoveContainer" containerID="e4639720487461648ce28a72053f5fac482eda9612d2425823ae586a6c1dbcd7" Mar 21 05:50:15 crc kubenswrapper[4580]: I0321 05:50:15.624283 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:50:15 crc kubenswrapper[4580]: E0321 05:50:15.624576 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:50:25 crc kubenswrapper[4580]: I0321 05:50:25.902830 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/util/0.log" Mar 21 05:50:26 crc kubenswrapper[4580]: I0321 05:50:26.147142 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/util/0.log" Mar 21 05:50:26 crc kubenswrapper[4580]: I0321 05:50:26.151246 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/pull/0.log" Mar 21 05:50:26 crc kubenswrapper[4580]: I0321 05:50:26.218201 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/pull/0.log" Mar 21 05:50:26 crc kubenswrapper[4580]: I0321 05:50:26.404696 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/extract/0.log" Mar 21 05:50:26 crc kubenswrapper[4580]: I0321 05:50:26.410156 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/pull/0.log" Mar 21 05:50:26 crc kubenswrapper[4580]: I0321 05:50:26.429570 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/util/0.log" Mar 21 05:50:26 crc kubenswrapper[4580]: I0321 05:50:26.580097 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/util/0.log" Mar 21 05:50:26 crc kubenswrapper[4580]: I0321 05:50:26.617907 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:50:26 crc kubenswrapper[4580]: E0321 05:50:26.618172 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:50:26 crc kubenswrapper[4580]: I0321 05:50:26.794004 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/util/0.log" Mar 21 05:50:26 crc kubenswrapper[4580]: I0321 05:50:26.797266 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/pull/0.log" Mar 21 05:50:26 crc kubenswrapper[4580]: I0321 05:50:26.811024 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/pull/0.log" Mar 21 05:50:26 crc kubenswrapper[4580]: I0321 05:50:26.995263 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/util/0.log" Mar 21 05:50:27 crc kubenswrapper[4580]: I0321 05:50:27.003138 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/extract/0.log" Mar 21 05:50:27 crc kubenswrapper[4580]: I0321 05:50:27.054134 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/pull/0.log" Mar 21 05:50:27 crc kubenswrapper[4580]: I0321 05:50:27.203412 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/extract-utilities/0.log" Mar 21 05:50:27 crc kubenswrapper[4580]: I0321 05:50:27.347438 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/extract-content/0.log" Mar 21 05:50:27 crc kubenswrapper[4580]: I0321 05:50:27.430437 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/extract-utilities/0.log" Mar 21 05:50:27 crc kubenswrapper[4580]: I0321 05:50:27.449442 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/extract-content/0.log" Mar 21 05:50:27 crc kubenswrapper[4580]: I0321 05:50:27.676434 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/extract-content/0.log" Mar 21 05:50:27 crc kubenswrapper[4580]: I0321 05:50:27.690506 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/extract-utilities/0.log" Mar 21 05:50:28 crc kubenswrapper[4580]: I0321 05:50:28.038934 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/extract-utilities/0.log" Mar 21 05:50:28 crc kubenswrapper[4580]: I0321 05:50:28.170754 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/registry-server/0.log" Mar 21 05:50:28 crc kubenswrapper[4580]: I0321 05:50:28.213322 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/extract-utilities/0.log" Mar 21 05:50:28 crc kubenswrapper[4580]: I0321 05:50:28.273653 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/extract-content/0.log" Mar 21 05:50:28 crc kubenswrapper[4580]: I0321 05:50:28.345930 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/extract-content/0.log" Mar 21 05:50:28 crc kubenswrapper[4580]: I0321 05:50:28.532477 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/extract-content/0.log" Mar 21 05:50:28 crc kubenswrapper[4580]: I0321 05:50:28.585562 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/extract-utilities/0.log" Mar 21 05:50:28 crc kubenswrapper[4580]: I0321 05:50:28.823039 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-p95jn_6ad5649a-1bef-41a6-aeaa-73f2850df16a/marketplace-operator/0.log" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.025252 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/registry-server/0.log" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.029727 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/extract-utilities/0.log" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.237218 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/extract-content/0.log" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.237696 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/extract-utilities/0.log" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.297075 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/extract-content/0.log" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.456333 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/extract-content/0.log" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.503751 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/extract-utilities/0.log" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.648814 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/registry-server/0.log" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.704172 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/extract-utilities/0.log" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.907564 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/extract-content/0.log" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.925990 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/extract-content/0.log" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.948632 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f4z8g"] Mar 21 05:50:29 crc kubenswrapper[4580]: E0321 05:50:29.949444 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd778e7-6536-464a-8ce1-e9a26c7d8c58" containerName="oc" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.949464 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd778e7-6536-464a-8ce1-e9a26c7d8c58" containerName="oc" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.949943 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd778e7-6536-464a-8ce1-e9a26c7d8c58" containerName="oc" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.990005 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:29 crc kubenswrapper[4580]: I0321 05:50:29.990751 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f4z8g"] Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.003854 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/extract-utilities/0.log" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.047538 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-catalog-content\") pod \"certified-operators-f4z8g\" (UID: \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\") " pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.047619 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-utilities\") pod \"certified-operators-f4z8g\" (UID: \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\") " pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.047684 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cslms\" (UniqueName: \"kubernetes.io/projected/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-kube-api-access-cslms\") pod \"certified-operators-f4z8g\" (UID: \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\") " pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.149317 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-catalog-content\") pod 
\"certified-operators-f4z8g\" (UID: \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\") " pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.149388 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-utilities\") pod \"certified-operators-f4z8g\" (UID: \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\") " pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.149440 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cslms\" (UniqueName: \"kubernetes.io/projected/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-kube-api-access-cslms\") pod \"certified-operators-f4z8g\" (UID: \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\") " pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.149914 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-catalog-content\") pod \"certified-operators-f4z8g\" (UID: \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\") " pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.150154 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-utilities\") pod \"certified-operators-f4z8g\" (UID: \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\") " pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.178112 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cslms\" (UniqueName: \"kubernetes.io/projected/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-kube-api-access-cslms\") pod \"certified-operators-f4z8g\" (UID: 
\"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\") " pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.281176 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/extract-content/0.log" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.289017 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/extract-utilities/0.log" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.329236 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.737897 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f4z8g"] Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.935582 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/registry-server/0.log" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.953000 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z5rbr"] Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.954898 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.977062 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-catalog-content\") pod \"redhat-marketplace-z5rbr\" (UID: \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\") " pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.977198 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-utilities\") pod \"redhat-marketplace-z5rbr\" (UID: \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\") " pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.977328 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7mxr\" (UniqueName: \"kubernetes.io/projected/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-kube-api-access-v7mxr\") pod \"redhat-marketplace-z5rbr\" (UID: \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\") " pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.988357 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4z8g" event={"ID":"f67d08fa-2f05-48c0-b80b-3a1f3caae64b","Type":"ContainerStarted","Data":"4face93dcb74c9a008ad3d6aa56b70aa8839a2166d7c98133b5bb7170881e2ca"} Mar 21 05:50:30 crc kubenswrapper[4580]: I0321 05:50:30.988395 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4z8g" event={"ID":"f67d08fa-2f05-48c0-b80b-3a1f3caae64b","Type":"ContainerStarted","Data":"f27befd2fe7ad9c104f9e42fa20fd4a1fbd5906e03ab302fa3ab81dd6c3dc641"} Mar 21 05:50:30 crc 
kubenswrapper[4580]: I0321 05:50:30.996311 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5rbr"] Mar 21 05:50:31 crc kubenswrapper[4580]: I0321 05:50:31.089680 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-catalog-content\") pod \"redhat-marketplace-z5rbr\" (UID: \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\") " pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:31 crc kubenswrapper[4580]: I0321 05:50:31.089834 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-utilities\") pod \"redhat-marketplace-z5rbr\" (UID: \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\") " pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:31 crc kubenswrapper[4580]: I0321 05:50:31.089977 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7mxr\" (UniqueName: \"kubernetes.io/projected/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-kube-api-access-v7mxr\") pod \"redhat-marketplace-z5rbr\" (UID: \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\") " pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:31 crc kubenswrapper[4580]: I0321 05:50:31.090338 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-catalog-content\") pod \"redhat-marketplace-z5rbr\" (UID: \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\") " pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:31 crc kubenswrapper[4580]: I0321 05:50:31.090700 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-utilities\") pod \"redhat-marketplace-z5rbr\" 
(UID: \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\") " pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:31 crc kubenswrapper[4580]: I0321 05:50:31.109268 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7mxr\" (UniqueName: \"kubernetes.io/projected/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-kube-api-access-v7mxr\") pod \"redhat-marketplace-z5rbr\" (UID: \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\") " pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:31 crc kubenswrapper[4580]: E0321 05:50:31.198273 4580 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf67d08fa_2f05_48c0_b80b_3a1f3caae64b.slice/crio-4face93dcb74c9a008ad3d6aa56b70aa8839a2166d7c98133b5bb7170881e2ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf67d08fa_2f05_48c0_b80b_3a1f3caae64b.slice/crio-conmon-4face93dcb74c9a008ad3d6aa56b70aa8839a2166d7c98133b5bb7170881e2ca.scope\": RecentStats: unable to find data in memory cache]" Mar 21 05:50:31 crc kubenswrapper[4580]: I0321 05:50:31.291571 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:31 crc kubenswrapper[4580]: I0321 05:50:31.866620 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5rbr"] Mar 21 05:50:32 crc kubenswrapper[4580]: I0321 05:50:32.001695 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5rbr" event={"ID":"956484f4-be5d-4aa2-a9d7-a5b15f1da93a","Type":"ContainerStarted","Data":"c80fc29511fd706611c6c1800495cdec72c5291577eddf181d03412561221cba"} Mar 21 05:50:32 crc kubenswrapper[4580]: I0321 05:50:32.003308 4580 generic.go:334] "Generic (PLEG): container finished" podID="f67d08fa-2f05-48c0-b80b-3a1f3caae64b" containerID="4face93dcb74c9a008ad3d6aa56b70aa8839a2166d7c98133b5bb7170881e2ca" exitCode=0 Mar 21 05:50:32 crc kubenswrapper[4580]: I0321 05:50:32.003336 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4z8g" event={"ID":"f67d08fa-2f05-48c0-b80b-3a1f3caae64b","Type":"ContainerDied","Data":"4face93dcb74c9a008ad3d6aa56b70aa8839a2166d7c98133b5bb7170881e2ca"} Mar 21 05:50:33 crc kubenswrapper[4580]: I0321 05:50:33.013302 4580 generic.go:334] "Generic (PLEG): container finished" podID="956484f4-be5d-4aa2-a9d7-a5b15f1da93a" containerID="e18a31587337dcd52bd75a79841bd68ca0da7519850a5e1947709bc46a2eb799" exitCode=0 Mar 21 05:50:33 crc kubenswrapper[4580]: I0321 05:50:33.013343 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5rbr" event={"ID":"956484f4-be5d-4aa2-a9d7-a5b15f1da93a","Type":"ContainerDied","Data":"e18a31587337dcd52bd75a79841bd68ca0da7519850a5e1947709bc46a2eb799"} Mar 21 05:50:33 crc kubenswrapper[4580]: I0321 05:50:33.016795 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4z8g" 
event={"ID":"f67d08fa-2f05-48c0-b80b-3a1f3caae64b","Type":"ContainerStarted","Data":"cdbb3ebaa672afedf8c9f725012e72295f6ee9a7c0ec98e72dbeb5894cf75ef3"} Mar 21 05:50:34 crc kubenswrapper[4580]: I0321 05:50:34.028467 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5rbr" event={"ID":"956484f4-be5d-4aa2-a9d7-a5b15f1da93a","Type":"ContainerStarted","Data":"e213f6e34a89967090a14b40347faa06d1ae21d4ab1e770bf79888eb32c54f07"} Mar 21 05:50:35 crc kubenswrapper[4580]: I0321 05:50:35.039666 4580 generic.go:334] "Generic (PLEG): container finished" podID="f67d08fa-2f05-48c0-b80b-3a1f3caae64b" containerID="cdbb3ebaa672afedf8c9f725012e72295f6ee9a7c0ec98e72dbeb5894cf75ef3" exitCode=0 Mar 21 05:50:35 crc kubenswrapper[4580]: I0321 05:50:35.039879 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4z8g" event={"ID":"f67d08fa-2f05-48c0-b80b-3a1f3caae64b","Type":"ContainerDied","Data":"cdbb3ebaa672afedf8c9f725012e72295f6ee9a7c0ec98e72dbeb5894cf75ef3"} Mar 21 05:50:35 crc kubenswrapper[4580]: I0321 05:50:35.047366 4580 generic.go:334] "Generic (PLEG): container finished" podID="956484f4-be5d-4aa2-a9d7-a5b15f1da93a" containerID="e213f6e34a89967090a14b40347faa06d1ae21d4ab1e770bf79888eb32c54f07" exitCode=0 Mar 21 05:50:35 crc kubenswrapper[4580]: I0321 05:50:35.047398 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5rbr" event={"ID":"956484f4-be5d-4aa2-a9d7-a5b15f1da93a","Type":"ContainerDied","Data":"e213f6e34a89967090a14b40347faa06d1ae21d4ab1e770bf79888eb32c54f07"} Mar 21 05:50:36 crc kubenswrapper[4580]: I0321 05:50:36.055640 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5rbr" event={"ID":"956484f4-be5d-4aa2-a9d7-a5b15f1da93a","Type":"ContainerStarted","Data":"f9b204b3eb2d37e9304d339f070486ca497f4d39532a0685dc2524ba3f95e0b2"} Mar 21 05:50:36 crc kubenswrapper[4580]: I0321 
05:50:36.060509 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4z8g" event={"ID":"f67d08fa-2f05-48c0-b80b-3a1f3caae64b","Type":"ContainerStarted","Data":"1f78d2990e5621d499ed84708e11d2bc0ba05617aff84e3886e0b410124f5812"} Mar 21 05:50:36 crc kubenswrapper[4580]: I0321 05:50:36.079877 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z5rbr" podStartSLOduration=3.672671969 podStartE2EDuration="6.079856594s" podCreationTimestamp="2026-03-21 05:50:30 +0000 UTC" firstStartedPulling="2026-03-21 05:50:33.0176359 +0000 UTC m=+3538.100219538" lastFinishedPulling="2026-03-21 05:50:35.424820535 +0000 UTC m=+3540.507404163" observedRunningTime="2026-03-21 05:50:36.070105162 +0000 UTC m=+3541.152688800" watchObservedRunningTime="2026-03-21 05:50:36.079856594 +0000 UTC m=+3541.162440222" Mar 21 05:50:36 crc kubenswrapper[4580]: I0321 05:50:36.100859 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f4z8g" podStartSLOduration=3.6710811420000002 podStartE2EDuration="7.100843229s" podCreationTimestamp="2026-03-21 05:50:29 +0000 UTC" firstStartedPulling="2026-03-21 05:50:32.005861329 +0000 UTC m=+3537.088444957" lastFinishedPulling="2026-03-21 05:50:35.435623426 +0000 UTC m=+3540.518207044" observedRunningTime="2026-03-21 05:50:36.091367824 +0000 UTC m=+3541.173951462" watchObservedRunningTime="2026-03-21 05:50:36.100843229 +0000 UTC m=+3541.183426857" Mar 21 05:50:37 crc kubenswrapper[4580]: I0321 05:50:37.618438 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:50:37 crc kubenswrapper[4580]: E0321 05:50:37.619913 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:50:40 crc kubenswrapper[4580]: I0321 05:50:40.330046 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:40 crc kubenswrapper[4580]: I0321 05:50:40.330383 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:40 crc kubenswrapper[4580]: I0321 05:50:40.371566 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:41 crc kubenswrapper[4580]: I0321 05:50:41.172031 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:41 crc kubenswrapper[4580]: I0321 05:50:41.291804 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:41 crc kubenswrapper[4580]: I0321 05:50:41.292270 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:41 crc kubenswrapper[4580]: I0321 05:50:41.348676 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:41 crc kubenswrapper[4580]: I0321 05:50:41.939628 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f4z8g"] Mar 21 05:50:42 crc kubenswrapper[4580]: I0321 05:50:42.161742 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:43 crc kubenswrapper[4580]: I0321 
05:50:43.148673 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f4z8g" podUID="f67d08fa-2f05-48c0-b80b-3a1f3caae64b" containerName="registry-server" containerID="cri-o://1f78d2990e5621d499ed84708e11d2bc0ba05617aff84e3886e0b410124f5812" gracePeriod=2 Mar 21 05:50:43 crc kubenswrapper[4580]: I0321 05:50:43.732148 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5rbr"] Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.154669 4580 generic.go:334] "Generic (PLEG): container finished" podID="f67d08fa-2f05-48c0-b80b-3a1f3caae64b" containerID="1f78d2990e5621d499ed84708e11d2bc0ba05617aff84e3886e0b410124f5812" exitCode=0 Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.154907 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z5rbr" podUID="956484f4-be5d-4aa2-a9d7-a5b15f1da93a" containerName="registry-server" containerID="cri-o://f9b204b3eb2d37e9304d339f070486ca497f4d39532a0685dc2524ba3f95e0b2" gracePeriod=2 Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.154895 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4z8g" event={"ID":"f67d08fa-2f05-48c0-b80b-3a1f3caae64b","Type":"ContainerDied","Data":"1f78d2990e5621d499ed84708e11d2bc0ba05617aff84e3886e0b410124f5812"} Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.155042 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4z8g" event={"ID":"f67d08fa-2f05-48c0-b80b-3a1f3caae64b","Type":"ContainerDied","Data":"f27befd2fe7ad9c104f9e42fa20fd4a1fbd5906e03ab302fa3ab81dd6c3dc641"} Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.155065 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f27befd2fe7ad9c104f9e42fa20fd4a1fbd5906e03ab302fa3ab81dd6c3dc641" Mar 21 05:50:44 
crc kubenswrapper[4580]: I0321 05:50:44.225660 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.314813 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cslms\" (UniqueName: \"kubernetes.io/projected/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-kube-api-access-cslms\") pod \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\" (UID: \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\") " Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.314956 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-catalog-content\") pod \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\" (UID: \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\") " Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.315134 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-utilities\") pod \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\" (UID: \"f67d08fa-2f05-48c0-b80b-3a1f3caae64b\") " Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.316932 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-utilities" (OuterVolumeSpecName: "utilities") pod "f67d08fa-2f05-48c0-b80b-3a1f3caae64b" (UID: "f67d08fa-2f05-48c0-b80b-3a1f3caae64b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.325206 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-kube-api-access-cslms" (OuterVolumeSpecName: "kube-api-access-cslms") pod "f67d08fa-2f05-48c0-b80b-3a1f3caae64b" (UID: "f67d08fa-2f05-48c0-b80b-3a1f3caae64b"). InnerVolumeSpecName "kube-api-access-cslms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.374064 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f67d08fa-2f05-48c0-b80b-3a1f3caae64b" (UID: "f67d08fa-2f05-48c0-b80b-3a1f3caae64b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.417074 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cslms\" (UniqueName: \"kubernetes.io/projected/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-kube-api-access-cslms\") on node \"crc\" DevicePath \"\"" Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.417363 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.417372 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f67d08fa-2f05-48c0-b80b-3a1f3caae64b-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.663885 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.733091 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-utilities\") pod \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\" (UID: \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\") " Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.733142 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7mxr\" (UniqueName: \"kubernetes.io/projected/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-kube-api-access-v7mxr\") pod \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\" (UID: \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\") " Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.733198 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-catalog-content\") pod \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\" (UID: \"956484f4-be5d-4aa2-a9d7-a5b15f1da93a\") " Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.733964 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-utilities" (OuterVolumeSpecName: "utilities") pod "956484f4-be5d-4aa2-a9d7-a5b15f1da93a" (UID: "956484f4-be5d-4aa2-a9d7-a5b15f1da93a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.736742 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-kube-api-access-v7mxr" (OuterVolumeSpecName: "kube-api-access-v7mxr") pod "956484f4-be5d-4aa2-a9d7-a5b15f1da93a" (UID: "956484f4-be5d-4aa2-a9d7-a5b15f1da93a"). InnerVolumeSpecName "kube-api-access-v7mxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.760370 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "956484f4-be5d-4aa2-a9d7-a5b15f1da93a" (UID: "956484f4-be5d-4aa2-a9d7-a5b15f1da93a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.836110 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.836147 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7mxr\" (UniqueName: \"kubernetes.io/projected/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-kube-api-access-v7mxr\") on node \"crc\" DevicePath \"\"" Mar 21 05:50:44 crc kubenswrapper[4580]: I0321 05:50:44.836158 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956484f4-be5d-4aa2-a9d7-a5b15f1da93a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.165289 4580 generic.go:334] "Generic (PLEG): container finished" podID="956484f4-be5d-4aa2-a9d7-a5b15f1da93a" containerID="f9b204b3eb2d37e9304d339f070486ca497f4d39532a0685dc2524ba3f95e0b2" exitCode=0 Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.165370 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5rbr" Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.165391 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f4z8g" Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.165386 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5rbr" event={"ID":"956484f4-be5d-4aa2-a9d7-a5b15f1da93a","Type":"ContainerDied","Data":"f9b204b3eb2d37e9304d339f070486ca497f4d39532a0685dc2524ba3f95e0b2"} Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.165441 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5rbr" event={"ID":"956484f4-be5d-4aa2-a9d7-a5b15f1da93a","Type":"ContainerDied","Data":"c80fc29511fd706611c6c1800495cdec72c5291577eddf181d03412561221cba"} Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.165464 4580 scope.go:117] "RemoveContainer" containerID="f9b204b3eb2d37e9304d339f070486ca497f4d39532a0685dc2524ba3f95e0b2" Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.215887 4580 scope.go:117] "RemoveContainer" containerID="e213f6e34a89967090a14b40347faa06d1ae21d4ab1e770bf79888eb32c54f07" Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.217701 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f4z8g"] Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.229246 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f4z8g"] Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.240443 4580 scope.go:117] "RemoveContainer" containerID="e18a31587337dcd52bd75a79841bd68ca0da7519850a5e1947709bc46a2eb799" Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.240584 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5rbr"] Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.251770 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5rbr"] Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 
05:50:45.274991 4580 scope.go:117] "RemoveContainer" containerID="f9b204b3eb2d37e9304d339f070486ca497f4d39532a0685dc2524ba3f95e0b2" Mar 21 05:50:45 crc kubenswrapper[4580]: E0321 05:50:45.275872 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9b204b3eb2d37e9304d339f070486ca497f4d39532a0685dc2524ba3f95e0b2\": container with ID starting with f9b204b3eb2d37e9304d339f070486ca497f4d39532a0685dc2524ba3f95e0b2 not found: ID does not exist" containerID="f9b204b3eb2d37e9304d339f070486ca497f4d39532a0685dc2524ba3f95e0b2" Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.275904 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9b204b3eb2d37e9304d339f070486ca497f4d39532a0685dc2524ba3f95e0b2"} err="failed to get container status \"f9b204b3eb2d37e9304d339f070486ca497f4d39532a0685dc2524ba3f95e0b2\": rpc error: code = NotFound desc = could not find container \"f9b204b3eb2d37e9304d339f070486ca497f4d39532a0685dc2524ba3f95e0b2\": container with ID starting with f9b204b3eb2d37e9304d339f070486ca497f4d39532a0685dc2524ba3f95e0b2 not found: ID does not exist" Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.275927 4580 scope.go:117] "RemoveContainer" containerID="e213f6e34a89967090a14b40347faa06d1ae21d4ab1e770bf79888eb32c54f07" Mar 21 05:50:45 crc kubenswrapper[4580]: E0321 05:50:45.276969 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e213f6e34a89967090a14b40347faa06d1ae21d4ab1e770bf79888eb32c54f07\": container with ID starting with e213f6e34a89967090a14b40347faa06d1ae21d4ab1e770bf79888eb32c54f07 not found: ID does not exist" containerID="e213f6e34a89967090a14b40347faa06d1ae21d4ab1e770bf79888eb32c54f07" Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.277023 4580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e213f6e34a89967090a14b40347faa06d1ae21d4ab1e770bf79888eb32c54f07"} err="failed to get container status \"e213f6e34a89967090a14b40347faa06d1ae21d4ab1e770bf79888eb32c54f07\": rpc error: code = NotFound desc = could not find container \"e213f6e34a89967090a14b40347faa06d1ae21d4ab1e770bf79888eb32c54f07\": container with ID starting with e213f6e34a89967090a14b40347faa06d1ae21d4ab1e770bf79888eb32c54f07 not found: ID does not exist" Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.277053 4580 scope.go:117] "RemoveContainer" containerID="e18a31587337dcd52bd75a79841bd68ca0da7519850a5e1947709bc46a2eb799" Mar 21 05:50:45 crc kubenswrapper[4580]: E0321 05:50:45.277667 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e18a31587337dcd52bd75a79841bd68ca0da7519850a5e1947709bc46a2eb799\": container with ID starting with e18a31587337dcd52bd75a79841bd68ca0da7519850a5e1947709bc46a2eb799 not found: ID does not exist" containerID="e18a31587337dcd52bd75a79841bd68ca0da7519850a5e1947709bc46a2eb799" Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.277698 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e18a31587337dcd52bd75a79841bd68ca0da7519850a5e1947709bc46a2eb799"} err="failed to get container status \"e18a31587337dcd52bd75a79841bd68ca0da7519850a5e1947709bc46a2eb799\": rpc error: code = NotFound desc = could not find container \"e18a31587337dcd52bd75a79841bd68ca0da7519850a5e1947709bc46a2eb799\": container with ID starting with e18a31587337dcd52bd75a79841bd68ca0da7519850a5e1947709bc46a2eb799 not found: ID does not exist" Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 05:50:45.633029 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956484f4-be5d-4aa2-a9d7-a5b15f1da93a" path="/var/lib/kubelet/pods/956484f4-be5d-4aa2-a9d7-a5b15f1da93a/volumes" Mar 21 05:50:45 crc kubenswrapper[4580]: I0321 
05:50:45.634769 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f67d08fa-2f05-48c0-b80b-3a1f3caae64b" path="/var/lib/kubelet/pods/f67d08fa-2f05-48c0-b80b-3a1f3caae64b/volumes" Mar 21 05:50:51 crc kubenswrapper[4580]: I0321 05:50:51.617507 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:50:52 crc kubenswrapper[4580]: I0321 05:50:52.230261 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"d3bcda5a760c6a54453a6b1a9b32ee5ceff6b3f290960e7e502c467bc51c6266"} Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.169865 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567872-fhqzx"] Mar 21 05:52:00 crc kubenswrapper[4580]: E0321 05:52:00.170993 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67d08fa-2f05-48c0-b80b-3a1f3caae64b" containerName="extract-utilities" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.171012 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67d08fa-2f05-48c0-b80b-3a1f3caae64b" containerName="extract-utilities" Mar 21 05:52:00 crc kubenswrapper[4580]: E0321 05:52:00.171036 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956484f4-be5d-4aa2-a9d7-a5b15f1da93a" containerName="registry-server" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.171043 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="956484f4-be5d-4aa2-a9d7-a5b15f1da93a" containerName="registry-server" Mar 21 05:52:00 crc kubenswrapper[4580]: E0321 05:52:00.171057 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956484f4-be5d-4aa2-a9d7-a5b15f1da93a" containerName="extract-utilities" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.171065 4580 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="956484f4-be5d-4aa2-a9d7-a5b15f1da93a" containerName="extract-utilities" Mar 21 05:52:00 crc kubenswrapper[4580]: E0321 05:52:00.171084 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956484f4-be5d-4aa2-a9d7-a5b15f1da93a" containerName="extract-content" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.171093 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="956484f4-be5d-4aa2-a9d7-a5b15f1da93a" containerName="extract-content" Mar 21 05:52:00 crc kubenswrapper[4580]: E0321 05:52:00.171109 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67d08fa-2f05-48c0-b80b-3a1f3caae64b" containerName="registry-server" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.171116 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67d08fa-2f05-48c0-b80b-3a1f3caae64b" containerName="registry-server" Mar 21 05:52:00 crc kubenswrapper[4580]: E0321 05:52:00.171135 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67d08fa-2f05-48c0-b80b-3a1f3caae64b" containerName="extract-content" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.171142 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67d08fa-2f05-48c0-b80b-3a1f3caae64b" containerName="extract-content" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.171412 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f67d08fa-2f05-48c0-b80b-3a1f3caae64b" containerName="registry-server" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.171441 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="956484f4-be5d-4aa2-a9d7-a5b15f1da93a" containerName="registry-server" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.172286 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567872-fhqzx" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.176008 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.176299 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.176487 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.206109 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567872-fhqzx"] Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.275944 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b8sr\" (UniqueName: \"kubernetes.io/projected/16298bac-6153-4f68-8eb4-551cde77ea48-kube-api-access-5b8sr\") pod \"auto-csr-approver-29567872-fhqzx\" (UID: \"16298bac-6153-4f68-8eb4-551cde77ea48\") " pod="openshift-infra/auto-csr-approver-29567872-fhqzx" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.377881 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b8sr\" (UniqueName: \"kubernetes.io/projected/16298bac-6153-4f68-8eb4-551cde77ea48-kube-api-access-5b8sr\") pod \"auto-csr-approver-29567872-fhqzx\" (UID: \"16298bac-6153-4f68-8eb4-551cde77ea48\") " pod="openshift-infra/auto-csr-approver-29567872-fhqzx" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.410108 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b8sr\" (UniqueName: \"kubernetes.io/projected/16298bac-6153-4f68-8eb4-551cde77ea48-kube-api-access-5b8sr\") pod \"auto-csr-approver-29567872-fhqzx\" (UID: \"16298bac-6153-4f68-8eb4-551cde77ea48\") " 
pod="openshift-infra/auto-csr-approver-29567872-fhqzx" Mar 21 05:52:00 crc kubenswrapper[4580]: I0321 05:52:00.500678 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567872-fhqzx" Mar 21 05:52:01 crc kubenswrapper[4580]: I0321 05:52:01.013899 4580 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:52:01 crc kubenswrapper[4580]: I0321 05:52:01.024165 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567872-fhqzx"] Mar 21 05:52:01 crc kubenswrapper[4580]: I0321 05:52:01.883327 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567872-fhqzx" event={"ID":"16298bac-6153-4f68-8eb4-551cde77ea48","Type":"ContainerStarted","Data":"588c6d0d434514983bd59717cca93982b4e41364324ad1f1622ce3d8848428c9"} Mar 21 05:52:02 crc kubenswrapper[4580]: I0321 05:52:02.900031 4580 generic.go:334] "Generic (PLEG): container finished" podID="16298bac-6153-4f68-8eb4-551cde77ea48" containerID="34cf123f49cc5b86f6dd1f47ef6222c35c002b386517787bbc0b2033f3312e54" exitCode=0 Mar 21 05:52:02 crc kubenswrapper[4580]: I0321 05:52:02.900135 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567872-fhqzx" event={"ID":"16298bac-6153-4f68-8eb4-551cde77ea48","Type":"ContainerDied","Data":"34cf123f49cc5b86f6dd1f47ef6222c35c002b386517787bbc0b2033f3312e54"} Mar 21 05:52:04 crc kubenswrapper[4580]: I0321 05:52:04.275493 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567872-fhqzx" Mar 21 05:52:04 crc kubenswrapper[4580]: I0321 05:52:04.372405 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b8sr\" (UniqueName: \"kubernetes.io/projected/16298bac-6153-4f68-8eb4-551cde77ea48-kube-api-access-5b8sr\") pod \"16298bac-6153-4f68-8eb4-551cde77ea48\" (UID: \"16298bac-6153-4f68-8eb4-551cde77ea48\") " Mar 21 05:52:04 crc kubenswrapper[4580]: I0321 05:52:04.378666 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16298bac-6153-4f68-8eb4-551cde77ea48-kube-api-access-5b8sr" (OuterVolumeSpecName: "kube-api-access-5b8sr") pod "16298bac-6153-4f68-8eb4-551cde77ea48" (UID: "16298bac-6153-4f68-8eb4-551cde77ea48"). InnerVolumeSpecName "kube-api-access-5b8sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:52:04 crc kubenswrapper[4580]: I0321 05:52:04.475444 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b8sr\" (UniqueName: \"kubernetes.io/projected/16298bac-6153-4f68-8eb4-551cde77ea48-kube-api-access-5b8sr\") on node \"crc\" DevicePath \"\"" Mar 21 05:52:04 crc kubenswrapper[4580]: I0321 05:52:04.917858 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567872-fhqzx" event={"ID":"16298bac-6153-4f68-8eb4-551cde77ea48","Type":"ContainerDied","Data":"588c6d0d434514983bd59717cca93982b4e41364324ad1f1622ce3d8848428c9"} Mar 21 05:52:04 crc kubenswrapper[4580]: I0321 05:52:04.917899 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567872-fhqzx" Mar 21 05:52:04 crc kubenswrapper[4580]: I0321 05:52:04.917901 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="588c6d0d434514983bd59717cca93982b4e41364324ad1f1622ce3d8848428c9" Mar 21 05:52:05 crc kubenswrapper[4580]: I0321 05:52:05.350452 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567866-t9zwb"] Mar 21 05:52:05 crc kubenswrapper[4580]: I0321 05:52:05.361036 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567866-t9zwb"] Mar 21 05:52:05 crc kubenswrapper[4580]: I0321 05:52:05.628333 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01138593-99cf-4cf0-b9bc-c9c8fb742e20" path="/var/lib/kubelet/pods/01138593-99cf-4cf0-b9bc-c9c8fb742e20/volumes" Mar 21 05:52:15 crc kubenswrapper[4580]: I0321 05:52:15.521691 4580 scope.go:117] "RemoveContainer" containerID="a187d6d091ab7be4b5941e90c19f8202c786c4c7a7a17e9cc93e6799c52ea5f7" Mar 21 05:52:30 crc kubenswrapper[4580]: I0321 05:52:30.188180 4580 generic.go:334] "Generic (PLEG): container finished" podID="c79ccdc7-5b73-4541-8dd6-1d11172e66df" containerID="1ce1e2bca08ecc44ca9d936dae6d0014fe0f2b346d34eaa186628d26c31616a7" exitCode=0 Mar 21 05:52:30 crc kubenswrapper[4580]: I0321 05:52:30.188328 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bpk7s/must-gather-r67c9" event={"ID":"c79ccdc7-5b73-4541-8dd6-1d11172e66df","Type":"ContainerDied","Data":"1ce1e2bca08ecc44ca9d936dae6d0014fe0f2b346d34eaa186628d26c31616a7"} Mar 21 05:52:30 crc kubenswrapper[4580]: I0321 05:52:30.190119 4580 scope.go:117] "RemoveContainer" containerID="1ce1e2bca08ecc44ca9d936dae6d0014fe0f2b346d34eaa186628d26c31616a7" Mar 21 05:52:31 crc kubenswrapper[4580]: I0321 05:52:31.013330 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-bpk7s_must-gather-r67c9_c79ccdc7-5b73-4541-8dd6-1d11172e66df/gather/0.log" Mar 21 05:52:39 crc kubenswrapper[4580]: I0321 05:52:39.268221 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bpk7s/must-gather-r67c9"] Mar 21 05:52:39 crc kubenswrapper[4580]: I0321 05:52:39.268997 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-bpk7s/must-gather-r67c9" podUID="c79ccdc7-5b73-4541-8dd6-1d11172e66df" containerName="copy" containerID="cri-o://ccddcdbb108133a7b96ae03a5ea07d3849496a89c2321cc44e67f855135b6ebf" gracePeriod=2 Mar 21 05:52:39 crc kubenswrapper[4580]: I0321 05:52:39.278415 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bpk7s/must-gather-r67c9"] Mar 21 05:52:39 crc kubenswrapper[4580]: I0321 05:52:39.719124 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bpk7s_must-gather-r67c9_c79ccdc7-5b73-4541-8dd6-1d11172e66df/copy/0.log" Mar 21 05:52:39 crc kubenswrapper[4580]: I0321 05:52:39.719951 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bpk7s/must-gather-r67c9" Mar 21 05:52:39 crc kubenswrapper[4580]: I0321 05:52:39.740018 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68cff\" (UniqueName: \"kubernetes.io/projected/c79ccdc7-5b73-4541-8dd6-1d11172e66df-kube-api-access-68cff\") pod \"c79ccdc7-5b73-4541-8dd6-1d11172e66df\" (UID: \"c79ccdc7-5b73-4541-8dd6-1d11172e66df\") " Mar 21 05:52:39 crc kubenswrapper[4580]: I0321 05:52:39.740401 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c79ccdc7-5b73-4541-8dd6-1d11172e66df-must-gather-output\") pod \"c79ccdc7-5b73-4541-8dd6-1d11172e66df\" (UID: \"c79ccdc7-5b73-4541-8dd6-1d11172e66df\") " Mar 21 05:52:39 crc kubenswrapper[4580]: I0321 05:52:39.760550 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c79ccdc7-5b73-4541-8dd6-1d11172e66df-kube-api-access-68cff" (OuterVolumeSpecName: "kube-api-access-68cff") pod "c79ccdc7-5b73-4541-8dd6-1d11172e66df" (UID: "c79ccdc7-5b73-4541-8dd6-1d11172e66df"). InnerVolumeSpecName "kube-api-access-68cff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:52:39 crc kubenswrapper[4580]: I0321 05:52:39.842512 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68cff\" (UniqueName: \"kubernetes.io/projected/c79ccdc7-5b73-4541-8dd6-1d11172e66df-kube-api-access-68cff\") on node \"crc\" DevicePath \"\"" Mar 21 05:52:39 crc kubenswrapper[4580]: I0321 05:52:39.914183 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c79ccdc7-5b73-4541-8dd6-1d11172e66df-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c79ccdc7-5b73-4541-8dd6-1d11172e66df" (UID: "c79ccdc7-5b73-4541-8dd6-1d11172e66df"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:52:39 crc kubenswrapper[4580]: I0321 05:52:39.943924 4580 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c79ccdc7-5b73-4541-8dd6-1d11172e66df-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 21 05:52:40 crc kubenswrapper[4580]: I0321 05:52:40.288182 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bpk7s_must-gather-r67c9_c79ccdc7-5b73-4541-8dd6-1d11172e66df/copy/0.log" Mar 21 05:52:40 crc kubenswrapper[4580]: I0321 05:52:40.288648 4580 generic.go:334] "Generic (PLEG): container finished" podID="c79ccdc7-5b73-4541-8dd6-1d11172e66df" containerID="ccddcdbb108133a7b96ae03a5ea07d3849496a89c2321cc44e67f855135b6ebf" exitCode=143 Mar 21 05:52:40 crc kubenswrapper[4580]: I0321 05:52:40.288701 4580 scope.go:117] "RemoveContainer" containerID="ccddcdbb108133a7b96ae03a5ea07d3849496a89c2321cc44e67f855135b6ebf" Mar 21 05:52:40 crc kubenswrapper[4580]: I0321 05:52:40.288853 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bpk7s/must-gather-r67c9" Mar 21 05:52:40 crc kubenswrapper[4580]: I0321 05:52:40.323117 4580 scope.go:117] "RemoveContainer" containerID="1ce1e2bca08ecc44ca9d936dae6d0014fe0f2b346d34eaa186628d26c31616a7" Mar 21 05:52:40 crc kubenswrapper[4580]: I0321 05:52:40.389546 4580 scope.go:117] "RemoveContainer" containerID="ccddcdbb108133a7b96ae03a5ea07d3849496a89c2321cc44e67f855135b6ebf" Mar 21 05:52:40 crc kubenswrapper[4580]: E0321 05:52:40.389929 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccddcdbb108133a7b96ae03a5ea07d3849496a89c2321cc44e67f855135b6ebf\": container with ID starting with ccddcdbb108133a7b96ae03a5ea07d3849496a89c2321cc44e67f855135b6ebf not found: ID does not exist" containerID="ccddcdbb108133a7b96ae03a5ea07d3849496a89c2321cc44e67f855135b6ebf" Mar 21 05:52:40 crc kubenswrapper[4580]: I0321 05:52:40.389965 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccddcdbb108133a7b96ae03a5ea07d3849496a89c2321cc44e67f855135b6ebf"} err="failed to get container status \"ccddcdbb108133a7b96ae03a5ea07d3849496a89c2321cc44e67f855135b6ebf\": rpc error: code = NotFound desc = could not find container \"ccddcdbb108133a7b96ae03a5ea07d3849496a89c2321cc44e67f855135b6ebf\": container with ID starting with ccddcdbb108133a7b96ae03a5ea07d3849496a89c2321cc44e67f855135b6ebf not found: ID does not exist" Mar 21 05:52:40 crc kubenswrapper[4580]: I0321 05:52:40.389992 4580 scope.go:117] "RemoveContainer" containerID="1ce1e2bca08ecc44ca9d936dae6d0014fe0f2b346d34eaa186628d26c31616a7" Mar 21 05:52:40 crc kubenswrapper[4580]: E0321 05:52:40.390412 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ce1e2bca08ecc44ca9d936dae6d0014fe0f2b346d34eaa186628d26c31616a7\": container with ID starting with 
1ce1e2bca08ecc44ca9d936dae6d0014fe0f2b346d34eaa186628d26c31616a7 not found: ID does not exist" containerID="1ce1e2bca08ecc44ca9d936dae6d0014fe0f2b346d34eaa186628d26c31616a7" Mar 21 05:52:40 crc kubenswrapper[4580]: I0321 05:52:40.390435 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce1e2bca08ecc44ca9d936dae6d0014fe0f2b346d34eaa186628d26c31616a7"} err="failed to get container status \"1ce1e2bca08ecc44ca9d936dae6d0014fe0f2b346d34eaa186628d26c31616a7\": rpc error: code = NotFound desc = could not find container \"1ce1e2bca08ecc44ca9d936dae6d0014fe0f2b346d34eaa186628d26c31616a7\": container with ID starting with 1ce1e2bca08ecc44ca9d936dae6d0014fe0f2b346d34eaa186628d26c31616a7 not found: ID does not exist" Mar 21 05:52:41 crc kubenswrapper[4580]: I0321 05:52:41.627852 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c79ccdc7-5b73-4541-8dd6-1d11172e66df" path="/var/lib/kubelet/pods/c79ccdc7-5b73-4541-8dd6-1d11172e66df/volumes" Mar 21 05:53:15 crc kubenswrapper[4580]: I0321 05:53:15.626252 4580 scope.go:117] "RemoveContainer" containerID="83ee520bd99fefe3b954fdfcbd2c1bfe32a8a7fdfa6540abf93c5a296a8f7bc5" Mar 21 05:53:15 crc kubenswrapper[4580]: I0321 05:53:15.948116 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:53:15 crc kubenswrapper[4580]: I0321 05:53:15.948547 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:53:45 crc kubenswrapper[4580]: I0321 
05:53:45.947403 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:53:45 crc kubenswrapper[4580]: I0321 05:53:45.947944 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.163509 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567874-nk5ld"] Mar 21 05:54:00 crc kubenswrapper[4580]: E0321 05:54:00.164473 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16298bac-6153-4f68-8eb4-551cde77ea48" containerName="oc" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.164489 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="16298bac-6153-4f68-8eb4-551cde77ea48" containerName="oc" Mar 21 05:54:00 crc kubenswrapper[4580]: E0321 05:54:00.164516 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79ccdc7-5b73-4541-8dd6-1d11172e66df" containerName="gather" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.164523 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79ccdc7-5b73-4541-8dd6-1d11172e66df" containerName="gather" Mar 21 05:54:00 crc kubenswrapper[4580]: E0321 05:54:00.164545 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79ccdc7-5b73-4541-8dd6-1d11172e66df" containerName="copy" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.164553 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79ccdc7-5b73-4541-8dd6-1d11172e66df" containerName="copy" Mar 
21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.164761 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="16298bac-6153-4f68-8eb4-551cde77ea48" containerName="oc" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.164821 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="c79ccdc7-5b73-4541-8dd6-1d11172e66df" containerName="gather" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.164830 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="c79ccdc7-5b73-4541-8dd6-1d11172e66df" containerName="copy" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.165532 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567874-nk5ld" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.173219 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.173482 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.177096 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.195590 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567874-nk5ld"] Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.249210 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5w7w\" (UniqueName: \"kubernetes.io/projected/be3bba12-a082-489f-b9b2-d582337a8979-kube-api-access-m5w7w\") pod \"auto-csr-approver-29567874-nk5ld\" (UID: \"be3bba12-a082-489f-b9b2-d582337a8979\") " pod="openshift-infra/auto-csr-approver-29567874-nk5ld" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.351617 4580 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m5w7w\" (UniqueName: \"kubernetes.io/projected/be3bba12-a082-489f-b9b2-d582337a8979-kube-api-access-m5w7w\") pod \"auto-csr-approver-29567874-nk5ld\" (UID: \"be3bba12-a082-489f-b9b2-d582337a8979\") " pod="openshift-infra/auto-csr-approver-29567874-nk5ld" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.382447 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5w7w\" (UniqueName: \"kubernetes.io/projected/be3bba12-a082-489f-b9b2-d582337a8979-kube-api-access-m5w7w\") pod \"auto-csr-approver-29567874-nk5ld\" (UID: \"be3bba12-a082-489f-b9b2-d582337a8979\") " pod="openshift-infra/auto-csr-approver-29567874-nk5ld" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.505177 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567874-nk5ld" Mar 21 05:54:00 crc kubenswrapper[4580]: I0321 05:54:00.985540 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567874-nk5ld"] Mar 21 05:54:01 crc kubenswrapper[4580]: I0321 05:54:01.077083 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567874-nk5ld" event={"ID":"be3bba12-a082-489f-b9b2-d582337a8979","Type":"ContainerStarted","Data":"58a49565d8bbdbddf1e0593449d92ff1a231293c7aff479da0aae97fb281485e"} Mar 21 05:54:03 crc kubenswrapper[4580]: I0321 05:54:03.096467 4580 generic.go:334] "Generic (PLEG): container finished" podID="be3bba12-a082-489f-b9b2-d582337a8979" containerID="b339f9a642b036511c49aa5f4066e01413343ff3a677569945aac32589a46d19" exitCode=0 Mar 21 05:54:03 crc kubenswrapper[4580]: I0321 05:54:03.096610 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567874-nk5ld" 
event={"ID":"be3bba12-a082-489f-b9b2-d582337a8979","Type":"ContainerDied","Data":"b339f9a642b036511c49aa5f4066e01413343ff3a677569945aac32589a46d19"} Mar 21 05:54:04 crc kubenswrapper[4580]: I0321 05:54:04.447537 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567874-nk5ld" Mar 21 05:54:04 crc kubenswrapper[4580]: I0321 05:54:04.551759 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5w7w\" (UniqueName: \"kubernetes.io/projected/be3bba12-a082-489f-b9b2-d582337a8979-kube-api-access-m5w7w\") pod \"be3bba12-a082-489f-b9b2-d582337a8979\" (UID: \"be3bba12-a082-489f-b9b2-d582337a8979\") " Mar 21 05:54:04 crc kubenswrapper[4580]: I0321 05:54:04.558228 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be3bba12-a082-489f-b9b2-d582337a8979-kube-api-access-m5w7w" (OuterVolumeSpecName: "kube-api-access-m5w7w") pod "be3bba12-a082-489f-b9b2-d582337a8979" (UID: "be3bba12-a082-489f-b9b2-d582337a8979"). InnerVolumeSpecName "kube-api-access-m5w7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:54:04 crc kubenswrapper[4580]: I0321 05:54:04.654298 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5w7w\" (UniqueName: \"kubernetes.io/projected/be3bba12-a082-489f-b9b2-d582337a8979-kube-api-access-m5w7w\") on node \"crc\" DevicePath \"\"" Mar 21 05:54:05 crc kubenswrapper[4580]: I0321 05:54:05.117232 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567874-nk5ld" event={"ID":"be3bba12-a082-489f-b9b2-d582337a8979","Type":"ContainerDied","Data":"58a49565d8bbdbddf1e0593449d92ff1a231293c7aff479da0aae97fb281485e"} Mar 21 05:54:05 crc kubenswrapper[4580]: I0321 05:54:05.117270 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58a49565d8bbdbddf1e0593449d92ff1a231293c7aff479da0aae97fb281485e" Mar 21 05:54:05 crc kubenswrapper[4580]: I0321 05:54:05.117281 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567874-nk5ld" Mar 21 05:54:05 crc kubenswrapper[4580]: I0321 05:54:05.557039 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567868-vmcb5"] Mar 21 05:54:05 crc kubenswrapper[4580]: I0321 05:54:05.565207 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567868-vmcb5"] Mar 21 05:54:05 crc kubenswrapper[4580]: I0321 05:54:05.628339 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9730819c-907e-4985-858b-dbed32715065" path="/var/lib/kubelet/pods/9730819c-907e-4985-858b-dbed32715065/volumes" Mar 21 05:54:15 crc kubenswrapper[4580]: I0321 05:54:15.695432 4580 scope.go:117] "RemoveContainer" containerID="c630e88ceb7733f4e1e827d58edcbf064f7c68dd8a45360ceaca3f231c525bdf" Mar 21 05:54:15 crc kubenswrapper[4580]: I0321 05:54:15.948116 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:54:15 crc kubenswrapper[4580]: I0321 05:54:15.948412 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:54:15 crc kubenswrapper[4580]: I0321 05:54:15.948454 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 05:54:15 crc kubenswrapper[4580]: I0321 05:54:15.949226 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3bcda5a760c6a54453a6b1a9b32ee5ceff6b3f290960e7e502c467bc51c6266"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:54:15 crc kubenswrapper[4580]: I0321 05:54:15.949275 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://d3bcda5a760c6a54453a6b1a9b32ee5ceff6b3f290960e7e502c467bc51c6266" gracePeriod=600 Mar 21 05:54:16 crc kubenswrapper[4580]: I0321 05:54:16.232856 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="d3bcda5a760c6a54453a6b1a9b32ee5ceff6b3f290960e7e502c467bc51c6266" exitCode=0 Mar 21 05:54:16 crc kubenswrapper[4580]: I0321 05:54:16.232900 4580 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"d3bcda5a760c6a54453a6b1a9b32ee5ceff6b3f290960e7e502c467bc51c6266"} Mar 21 05:54:16 crc kubenswrapper[4580]: I0321 05:54:16.232956 4580 scope.go:117] "RemoveContainer" containerID="99f21d803b3709b1f6f0ada1fec60fe52cbbb05242093d6a8589a6cc22612886" Mar 21 05:54:17 crc kubenswrapper[4580]: I0321 05:54:17.242687 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12"} Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.294479 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b4tks/must-gather-6c4ns"] Mar 21 05:55:41 crc kubenswrapper[4580]: E0321 05:55:41.299247 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be3bba12-a082-489f-b9b2-d582337a8979" containerName="oc" Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.299265 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3bba12-a082-489f-b9b2-d582337a8979" containerName="oc" Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.299501 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="be3bba12-a082-489f-b9b2-d582337a8979" containerName="oc" Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.300542 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b4tks/must-gather-6c4ns" Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.302521 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-b4tks"/"default-dockercfg-k286h" Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.303802 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b4tks/must-gather-6c4ns"] Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.303887 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b4tks"/"kube-root-ca.crt" Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.303885 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b4tks"/"openshift-service-ca.crt" Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.413242 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2d986e27-d896-4e73-9a01-c99895700d10-must-gather-output\") pod \"must-gather-6c4ns\" (UID: \"2d986e27-d896-4e73-9a01-c99895700d10\") " pod="openshift-must-gather-b4tks/must-gather-6c4ns" Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.413300 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk2bk\" (UniqueName: \"kubernetes.io/projected/2d986e27-d896-4e73-9a01-c99895700d10-kube-api-access-rk2bk\") pod \"must-gather-6c4ns\" (UID: \"2d986e27-d896-4e73-9a01-c99895700d10\") " pod="openshift-must-gather-b4tks/must-gather-6c4ns" Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.515901 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2d986e27-d896-4e73-9a01-c99895700d10-must-gather-output\") pod \"must-gather-6c4ns\" (UID: \"2d986e27-d896-4e73-9a01-c99895700d10\") " 
pod="openshift-must-gather-b4tks/must-gather-6c4ns" Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.516246 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk2bk\" (UniqueName: \"kubernetes.io/projected/2d986e27-d896-4e73-9a01-c99895700d10-kube-api-access-rk2bk\") pod \"must-gather-6c4ns\" (UID: \"2d986e27-d896-4e73-9a01-c99895700d10\") " pod="openshift-must-gather-b4tks/must-gather-6c4ns" Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.516355 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2d986e27-d896-4e73-9a01-c99895700d10-must-gather-output\") pod \"must-gather-6c4ns\" (UID: \"2d986e27-d896-4e73-9a01-c99895700d10\") " pod="openshift-must-gather-b4tks/must-gather-6c4ns" Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.539412 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk2bk\" (UniqueName: \"kubernetes.io/projected/2d986e27-d896-4e73-9a01-c99895700d10-kube-api-access-rk2bk\") pod \"must-gather-6c4ns\" (UID: \"2d986e27-d896-4e73-9a01-c99895700d10\") " pod="openshift-must-gather-b4tks/must-gather-6c4ns" Mar 21 05:55:41 crc kubenswrapper[4580]: I0321 05:55:41.654535 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b4tks/must-gather-6c4ns" Mar 21 05:55:42 crc kubenswrapper[4580]: I0321 05:55:42.250226 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b4tks/must-gather-6c4ns"] Mar 21 05:55:43 crc kubenswrapper[4580]: I0321 05:55:43.075328 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b4tks/must-gather-6c4ns" event={"ID":"2d986e27-d896-4e73-9a01-c99895700d10","Type":"ContainerStarted","Data":"5313dff47ccf04da59b5ab712b584e95bbebbb240510d875928ce13240c4e1f4"} Mar 21 05:55:43 crc kubenswrapper[4580]: I0321 05:55:43.075905 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b4tks/must-gather-6c4ns" event={"ID":"2d986e27-d896-4e73-9a01-c99895700d10","Type":"ContainerStarted","Data":"155f69cf6352fe4ebb7de6e187c8475a9c62a77716a57bdfec71929e2f429654"} Mar 21 05:55:43 crc kubenswrapper[4580]: I0321 05:55:43.075919 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b4tks/must-gather-6c4ns" event={"ID":"2d986e27-d896-4e73-9a01-c99895700d10","Type":"ContainerStarted","Data":"4c160273e2badf920128742194a7fcecf8072a9d8bead55a7bc532e75a3d75d5"} Mar 21 05:55:43 crc kubenswrapper[4580]: I0321 05:55:43.135030 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b4tks/must-gather-6c4ns" podStartSLOduration=2.134994686 podStartE2EDuration="2.134994686s" podCreationTimestamp="2026-03-21 05:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:55:43.094434879 +0000 UTC m=+3848.177018507" watchObservedRunningTime="2026-03-21 05:55:43.134994686 +0000 UTC m=+3848.217578324" Mar 21 05:55:47 crc kubenswrapper[4580]: I0321 05:55:47.827363 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b4tks/crc-debug-wbd7p"] Mar 21 05:55:47 crc kubenswrapper[4580]: 
I0321 05:55:47.830074 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b4tks/crc-debug-wbd7p" Mar 21 05:55:47 crc kubenswrapper[4580]: I0321 05:55:47.893872 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7xth\" (UniqueName: \"kubernetes.io/projected/0f8440a6-52f5-4767-91ba-ba88bb319685-kube-api-access-v7xth\") pod \"crc-debug-wbd7p\" (UID: \"0f8440a6-52f5-4767-91ba-ba88bb319685\") " pod="openshift-must-gather-b4tks/crc-debug-wbd7p" Mar 21 05:55:47 crc kubenswrapper[4580]: I0321 05:55:47.894272 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f8440a6-52f5-4767-91ba-ba88bb319685-host\") pod \"crc-debug-wbd7p\" (UID: \"0f8440a6-52f5-4767-91ba-ba88bb319685\") " pod="openshift-must-gather-b4tks/crc-debug-wbd7p" Mar 21 05:55:47 crc kubenswrapper[4580]: I0321 05:55:47.997075 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7xth\" (UniqueName: \"kubernetes.io/projected/0f8440a6-52f5-4767-91ba-ba88bb319685-kube-api-access-v7xth\") pod \"crc-debug-wbd7p\" (UID: \"0f8440a6-52f5-4767-91ba-ba88bb319685\") " pod="openshift-must-gather-b4tks/crc-debug-wbd7p" Mar 21 05:55:47 crc kubenswrapper[4580]: I0321 05:55:47.997219 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f8440a6-52f5-4767-91ba-ba88bb319685-host\") pod \"crc-debug-wbd7p\" (UID: \"0f8440a6-52f5-4767-91ba-ba88bb319685\") " pod="openshift-must-gather-b4tks/crc-debug-wbd7p" Mar 21 05:55:47 crc kubenswrapper[4580]: I0321 05:55:47.997346 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f8440a6-52f5-4767-91ba-ba88bb319685-host\") pod \"crc-debug-wbd7p\" (UID: \"0f8440a6-52f5-4767-91ba-ba88bb319685\") 
" pod="openshift-must-gather-b4tks/crc-debug-wbd7p" Mar 21 05:55:48 crc kubenswrapper[4580]: I0321 05:55:48.018852 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7xth\" (UniqueName: \"kubernetes.io/projected/0f8440a6-52f5-4767-91ba-ba88bb319685-kube-api-access-v7xth\") pod \"crc-debug-wbd7p\" (UID: \"0f8440a6-52f5-4767-91ba-ba88bb319685\") " pod="openshift-must-gather-b4tks/crc-debug-wbd7p" Mar 21 05:55:48 crc kubenswrapper[4580]: I0321 05:55:48.151870 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b4tks/crc-debug-wbd7p" Mar 21 05:55:49 crc kubenswrapper[4580]: I0321 05:55:49.135417 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b4tks/crc-debug-wbd7p" event={"ID":"0f8440a6-52f5-4767-91ba-ba88bb319685","Type":"ContainerStarted","Data":"e27e778dbdf80afe1a92082948be63986d33051cff610253c485b5dbc3b1555c"} Mar 21 05:55:49 crc kubenswrapper[4580]: I0321 05:55:49.137390 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b4tks/crc-debug-wbd7p" event={"ID":"0f8440a6-52f5-4767-91ba-ba88bb319685","Type":"ContainerStarted","Data":"5db9dd17f9a029802e811fe00458435e10851d97af16087f72fd74ac4c6cc2c9"} Mar 21 05:55:49 crc kubenswrapper[4580]: I0321 05:55:49.155018 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b4tks/crc-debug-wbd7p" podStartSLOduration=2.154996364 podStartE2EDuration="2.154996364s" podCreationTimestamp="2026-03-21 05:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:55:49.152208298 +0000 UTC m=+3854.234791926" watchObservedRunningTime="2026-03-21 05:55:49.154996364 +0000 UTC m=+3854.237579992" Mar 21 05:56:00 crc kubenswrapper[4580]: I0321 05:56:00.160711 4580 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29567876-r6brf"] Mar 21 05:56:00 crc kubenswrapper[4580]: I0321 05:56:00.163566 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567876-r6brf" Mar 21 05:56:00 crc kubenswrapper[4580]: I0321 05:56:00.166580 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:56:00 crc kubenswrapper[4580]: I0321 05:56:00.167193 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:56:00 crc kubenswrapper[4580]: I0321 05:56:00.167235 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:56:00 crc kubenswrapper[4580]: I0321 05:56:00.183679 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567876-r6brf"] Mar 21 05:56:00 crc kubenswrapper[4580]: I0321 05:56:00.246832 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxwhv\" (UniqueName: \"kubernetes.io/projected/fd6acf86-a552-4342-a5e8-fb3fc12bed27-kube-api-access-zxwhv\") pod \"auto-csr-approver-29567876-r6brf\" (UID: \"fd6acf86-a552-4342-a5e8-fb3fc12bed27\") " pod="openshift-infra/auto-csr-approver-29567876-r6brf" Mar 21 05:56:00 crc kubenswrapper[4580]: I0321 05:56:00.349220 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxwhv\" (UniqueName: \"kubernetes.io/projected/fd6acf86-a552-4342-a5e8-fb3fc12bed27-kube-api-access-zxwhv\") pod \"auto-csr-approver-29567876-r6brf\" (UID: \"fd6acf86-a552-4342-a5e8-fb3fc12bed27\") " pod="openshift-infra/auto-csr-approver-29567876-r6brf" Mar 21 05:56:00 crc kubenswrapper[4580]: I0321 05:56:00.370236 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxwhv\" (UniqueName: 
\"kubernetes.io/projected/fd6acf86-a552-4342-a5e8-fb3fc12bed27-kube-api-access-zxwhv\") pod \"auto-csr-approver-29567876-r6brf\" (UID: \"fd6acf86-a552-4342-a5e8-fb3fc12bed27\") " pod="openshift-infra/auto-csr-approver-29567876-r6brf" Mar 21 05:56:00 crc kubenswrapper[4580]: I0321 05:56:00.490555 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567876-r6brf" Mar 21 05:56:01 crc kubenswrapper[4580]: I0321 05:56:01.076127 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567876-r6brf"] Mar 21 05:56:01 crc kubenswrapper[4580]: I0321 05:56:01.258138 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567876-r6brf" event={"ID":"fd6acf86-a552-4342-a5e8-fb3fc12bed27","Type":"ContainerStarted","Data":"f72f0a11bd36d66e4721a836ef3563d4f26398f12e61f75e505af5790f11bdcc"} Mar 21 05:56:03 crc kubenswrapper[4580]: I0321 05:56:03.314745 4580 generic.go:334] "Generic (PLEG): container finished" podID="fd6acf86-a552-4342-a5e8-fb3fc12bed27" containerID="25a9dc95778e9d3616efe72351a48efd85d2d781c1eed6b5c4b62cdcb9a67d24" exitCode=0 Mar 21 05:56:03 crc kubenswrapper[4580]: I0321 05:56:03.315292 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567876-r6brf" event={"ID":"fd6acf86-a552-4342-a5e8-fb3fc12bed27","Type":"ContainerDied","Data":"25a9dc95778e9d3616efe72351a48efd85d2d781c1eed6b5c4b62cdcb9a67d24"} Mar 21 05:56:04 crc kubenswrapper[4580]: I0321 05:56:04.706041 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567876-r6brf" Mar 21 05:56:04 crc kubenswrapper[4580]: I0321 05:56:04.749638 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxwhv\" (UniqueName: \"kubernetes.io/projected/fd6acf86-a552-4342-a5e8-fb3fc12bed27-kube-api-access-zxwhv\") pod \"fd6acf86-a552-4342-a5e8-fb3fc12bed27\" (UID: \"fd6acf86-a552-4342-a5e8-fb3fc12bed27\") " Mar 21 05:56:04 crc kubenswrapper[4580]: I0321 05:56:04.758007 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd6acf86-a552-4342-a5e8-fb3fc12bed27-kube-api-access-zxwhv" (OuterVolumeSpecName: "kube-api-access-zxwhv") pod "fd6acf86-a552-4342-a5e8-fb3fc12bed27" (UID: "fd6acf86-a552-4342-a5e8-fb3fc12bed27"). InnerVolumeSpecName "kube-api-access-zxwhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:56:04 crc kubenswrapper[4580]: I0321 05:56:04.851936 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxwhv\" (UniqueName: \"kubernetes.io/projected/fd6acf86-a552-4342-a5e8-fb3fc12bed27-kube-api-access-zxwhv\") on node \"crc\" DevicePath \"\"" Mar 21 05:56:05 crc kubenswrapper[4580]: I0321 05:56:05.347653 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567876-r6brf" event={"ID":"fd6acf86-a552-4342-a5e8-fb3fc12bed27","Type":"ContainerDied","Data":"f72f0a11bd36d66e4721a836ef3563d4f26398f12e61f75e505af5790f11bdcc"} Mar 21 05:56:05 crc kubenswrapper[4580]: I0321 05:56:05.347918 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f72f0a11bd36d66e4721a836ef3563d4f26398f12e61f75e505af5790f11bdcc" Mar 21 05:56:05 crc kubenswrapper[4580]: I0321 05:56:05.347709 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567876-r6brf" Mar 21 05:56:05 crc kubenswrapper[4580]: I0321 05:56:05.777484 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567870-np7k2"] Mar 21 05:56:05 crc kubenswrapper[4580]: I0321 05:56:05.787000 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567870-np7k2"] Mar 21 05:56:07 crc kubenswrapper[4580]: I0321 05:56:07.630214 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd778e7-6536-464a-8ce1-e9a26c7d8c58" path="/var/lib/kubelet/pods/0dd778e7-6536-464a-8ce1-e9a26c7d8c58/volumes" Mar 21 05:56:15 crc kubenswrapper[4580]: I0321 05:56:15.776220 4580 scope.go:117] "RemoveContainer" containerID="b70ae59290c3cc82e651e88234901a2d9a69ac07c47a3c956080528e0834621f" Mar 21 05:56:28 crc kubenswrapper[4580]: I0321 05:56:28.565725 4580 generic.go:334] "Generic (PLEG): container finished" podID="0f8440a6-52f5-4767-91ba-ba88bb319685" containerID="e27e778dbdf80afe1a92082948be63986d33051cff610253c485b5dbc3b1555c" exitCode=0 Mar 21 05:56:28 crc kubenswrapper[4580]: I0321 05:56:28.565827 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b4tks/crc-debug-wbd7p" event={"ID":"0f8440a6-52f5-4767-91ba-ba88bb319685","Type":"ContainerDied","Data":"e27e778dbdf80afe1a92082948be63986d33051cff610253c485b5dbc3b1555c"} Mar 21 05:56:29 crc kubenswrapper[4580]: I0321 05:56:29.697635 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b4tks/crc-debug-wbd7p" Mar 21 05:56:29 crc kubenswrapper[4580]: I0321 05:56:29.727645 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b4tks/crc-debug-wbd7p"] Mar 21 05:56:29 crc kubenswrapper[4580]: I0321 05:56:29.735482 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7xth\" (UniqueName: \"kubernetes.io/projected/0f8440a6-52f5-4767-91ba-ba88bb319685-kube-api-access-v7xth\") pod \"0f8440a6-52f5-4767-91ba-ba88bb319685\" (UID: \"0f8440a6-52f5-4767-91ba-ba88bb319685\") " Mar 21 05:56:29 crc kubenswrapper[4580]: I0321 05:56:29.735892 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f8440a6-52f5-4767-91ba-ba88bb319685-host\") pod \"0f8440a6-52f5-4767-91ba-ba88bb319685\" (UID: \"0f8440a6-52f5-4767-91ba-ba88bb319685\") " Mar 21 05:56:29 crc kubenswrapper[4580]: I0321 05:56:29.735898 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b4tks/crc-debug-wbd7p"] Mar 21 05:56:29 crc kubenswrapper[4580]: I0321 05:56:29.736476 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f8440a6-52f5-4767-91ba-ba88bb319685-host" (OuterVolumeSpecName: "host") pod "0f8440a6-52f5-4767-91ba-ba88bb319685" (UID: "0f8440a6-52f5-4767-91ba-ba88bb319685"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:56:29 crc kubenswrapper[4580]: I0321 05:56:29.747478 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f8440a6-52f5-4767-91ba-ba88bb319685-kube-api-access-v7xth" (OuterVolumeSpecName: "kube-api-access-v7xth") pod "0f8440a6-52f5-4767-91ba-ba88bb319685" (UID: "0f8440a6-52f5-4767-91ba-ba88bb319685"). InnerVolumeSpecName "kube-api-access-v7xth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:56:29 crc kubenswrapper[4580]: I0321 05:56:29.837883 4580 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f8440a6-52f5-4767-91ba-ba88bb319685-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:56:29 crc kubenswrapper[4580]: I0321 05:56:29.837919 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7xth\" (UniqueName: \"kubernetes.io/projected/0f8440a6-52f5-4767-91ba-ba88bb319685-kube-api-access-v7xth\") on node \"crc\" DevicePath \"\"" Mar 21 05:56:30 crc kubenswrapper[4580]: I0321 05:56:30.585727 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5db9dd17f9a029802e811fe00458435e10851d97af16087f72fd74ac4c6cc2c9" Mar 21 05:56:30 crc kubenswrapper[4580]: I0321 05:56:30.585799 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b4tks/crc-debug-wbd7p" Mar 21 05:56:30 crc kubenswrapper[4580]: I0321 05:56:30.933872 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b4tks/crc-debug-89l2k"] Mar 21 05:56:30 crc kubenswrapper[4580]: E0321 05:56:30.934346 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8440a6-52f5-4767-91ba-ba88bb319685" containerName="container-00" Mar 21 05:56:30 crc kubenswrapper[4580]: I0321 05:56:30.934361 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8440a6-52f5-4767-91ba-ba88bb319685" containerName="container-00" Mar 21 05:56:30 crc kubenswrapper[4580]: E0321 05:56:30.934386 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6acf86-a552-4342-a5e8-fb3fc12bed27" containerName="oc" Mar 21 05:56:30 crc kubenswrapper[4580]: I0321 05:56:30.934395 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6acf86-a552-4342-a5e8-fb3fc12bed27" containerName="oc" Mar 21 05:56:30 crc kubenswrapper[4580]: I0321 05:56:30.934638 4580 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0f8440a6-52f5-4767-91ba-ba88bb319685" containerName="container-00" Mar 21 05:56:30 crc kubenswrapper[4580]: I0321 05:56:30.934673 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd6acf86-a552-4342-a5e8-fb3fc12bed27" containerName="oc" Mar 21 05:56:30 crc kubenswrapper[4580]: I0321 05:56:30.935515 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b4tks/crc-debug-89l2k" Mar 21 05:56:30 crc kubenswrapper[4580]: I0321 05:56:30.956758 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhqfv\" (UniqueName: \"kubernetes.io/projected/214e8054-6f70-4078-9418-7ab26e0a523e-kube-api-access-dhqfv\") pod \"crc-debug-89l2k\" (UID: \"214e8054-6f70-4078-9418-7ab26e0a523e\") " pod="openshift-must-gather-b4tks/crc-debug-89l2k" Mar 21 05:56:30 crc kubenswrapper[4580]: I0321 05:56:30.956860 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/214e8054-6f70-4078-9418-7ab26e0a523e-host\") pod \"crc-debug-89l2k\" (UID: \"214e8054-6f70-4078-9418-7ab26e0a523e\") " pod="openshift-must-gather-b4tks/crc-debug-89l2k" Mar 21 05:56:31 crc kubenswrapper[4580]: I0321 05:56:31.058620 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhqfv\" (UniqueName: \"kubernetes.io/projected/214e8054-6f70-4078-9418-7ab26e0a523e-kube-api-access-dhqfv\") pod \"crc-debug-89l2k\" (UID: \"214e8054-6f70-4078-9418-7ab26e0a523e\") " pod="openshift-must-gather-b4tks/crc-debug-89l2k" Mar 21 05:56:31 crc kubenswrapper[4580]: I0321 05:56:31.058712 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/214e8054-6f70-4078-9418-7ab26e0a523e-host\") pod \"crc-debug-89l2k\" (UID: 
\"214e8054-6f70-4078-9418-7ab26e0a523e\") " pod="openshift-must-gather-b4tks/crc-debug-89l2k" Mar 21 05:56:31 crc kubenswrapper[4580]: I0321 05:56:31.058899 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/214e8054-6f70-4078-9418-7ab26e0a523e-host\") pod \"crc-debug-89l2k\" (UID: \"214e8054-6f70-4078-9418-7ab26e0a523e\") " pod="openshift-must-gather-b4tks/crc-debug-89l2k" Mar 21 05:56:31 crc kubenswrapper[4580]: I0321 05:56:31.077525 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhqfv\" (UniqueName: \"kubernetes.io/projected/214e8054-6f70-4078-9418-7ab26e0a523e-kube-api-access-dhqfv\") pod \"crc-debug-89l2k\" (UID: \"214e8054-6f70-4078-9418-7ab26e0a523e\") " pod="openshift-must-gather-b4tks/crc-debug-89l2k" Mar 21 05:56:31 crc kubenswrapper[4580]: I0321 05:56:31.258443 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b4tks/crc-debug-89l2k" Mar 21 05:56:31 crc kubenswrapper[4580]: I0321 05:56:31.597333 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b4tks/crc-debug-89l2k" event={"ID":"214e8054-6f70-4078-9418-7ab26e0a523e","Type":"ContainerStarted","Data":"a8377e11eea0ede5eb647c2e39b3cec914436b6b85fff3cf3d01a44d28da1e68"} Mar 21 05:56:31 crc kubenswrapper[4580]: I0321 05:56:31.597579 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b4tks/crc-debug-89l2k" event={"ID":"214e8054-6f70-4078-9418-7ab26e0a523e","Type":"ContainerStarted","Data":"830cac87b6a2d5a0f8e6876de8fcbe7c2cbc1af7a2c5d65b01dcb1d22d549380"} Mar 21 05:56:31 crc kubenswrapper[4580]: I0321 05:56:31.634923 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f8440a6-52f5-4767-91ba-ba88bb319685" path="/var/lib/kubelet/pods/0f8440a6-52f5-4767-91ba-ba88bb319685/volumes" Mar 21 05:56:32 crc kubenswrapper[4580]: I0321 05:56:32.608708 4580 generic.go:334] 
"Generic (PLEG): container finished" podID="214e8054-6f70-4078-9418-7ab26e0a523e" containerID="a8377e11eea0ede5eb647c2e39b3cec914436b6b85fff3cf3d01a44d28da1e68" exitCode=0 Mar 21 05:56:32 crc kubenswrapper[4580]: I0321 05:56:32.608818 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b4tks/crc-debug-89l2k" event={"ID":"214e8054-6f70-4078-9418-7ab26e0a523e","Type":"ContainerDied","Data":"a8377e11eea0ede5eb647c2e39b3cec914436b6b85fff3cf3d01a44d28da1e68"} Mar 21 05:56:34 crc kubenswrapper[4580]: I0321 05:56:34.249480 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b4tks/crc-debug-89l2k" Mar 21 05:56:34 crc kubenswrapper[4580]: I0321 05:56:34.299436 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b4tks/crc-debug-89l2k"] Mar 21 05:56:34 crc kubenswrapper[4580]: I0321 05:56:34.308554 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b4tks/crc-debug-89l2k"] Mar 21 05:56:34 crc kubenswrapper[4580]: I0321 05:56:34.326619 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/214e8054-6f70-4078-9418-7ab26e0a523e-host\") pod \"214e8054-6f70-4078-9418-7ab26e0a523e\" (UID: \"214e8054-6f70-4078-9418-7ab26e0a523e\") " Mar 21 05:56:34 crc kubenswrapper[4580]: I0321 05:56:34.326744 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/214e8054-6f70-4078-9418-7ab26e0a523e-host" (OuterVolumeSpecName: "host") pod "214e8054-6f70-4078-9418-7ab26e0a523e" (UID: "214e8054-6f70-4078-9418-7ab26e0a523e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:56:34 crc kubenswrapper[4580]: I0321 05:56:34.326810 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhqfv\" (UniqueName: \"kubernetes.io/projected/214e8054-6f70-4078-9418-7ab26e0a523e-kube-api-access-dhqfv\") pod \"214e8054-6f70-4078-9418-7ab26e0a523e\" (UID: \"214e8054-6f70-4078-9418-7ab26e0a523e\") " Mar 21 05:56:34 crc kubenswrapper[4580]: I0321 05:56:34.327233 4580 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/214e8054-6f70-4078-9418-7ab26e0a523e-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:56:34 crc kubenswrapper[4580]: I0321 05:56:34.332192 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/214e8054-6f70-4078-9418-7ab26e0a523e-kube-api-access-dhqfv" (OuterVolumeSpecName: "kube-api-access-dhqfv") pod "214e8054-6f70-4078-9418-7ab26e0a523e" (UID: "214e8054-6f70-4078-9418-7ab26e0a523e"). InnerVolumeSpecName "kube-api-access-dhqfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:56:34 crc kubenswrapper[4580]: I0321 05:56:34.428676 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhqfv\" (UniqueName: \"kubernetes.io/projected/214e8054-6f70-4078-9418-7ab26e0a523e-kube-api-access-dhqfv\") on node \"crc\" DevicePath \"\"" Mar 21 05:56:34 crc kubenswrapper[4580]: I0321 05:56:34.633459 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="830cac87b6a2d5a0f8e6876de8fcbe7c2cbc1af7a2c5d65b01dcb1d22d549380" Mar 21 05:56:34 crc kubenswrapper[4580]: I0321 05:56:34.633528 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b4tks/crc-debug-89l2k" Mar 21 05:56:35 crc kubenswrapper[4580]: I0321 05:56:35.636586 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="214e8054-6f70-4078-9418-7ab26e0a523e" path="/var/lib/kubelet/pods/214e8054-6f70-4078-9418-7ab26e0a523e/volumes" Mar 21 05:56:35 crc kubenswrapper[4580]: I0321 05:56:35.674182 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b4tks/crc-debug-hshf4"] Mar 21 05:56:35 crc kubenswrapper[4580]: E0321 05:56:35.678567 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214e8054-6f70-4078-9418-7ab26e0a523e" containerName="container-00" Mar 21 05:56:35 crc kubenswrapper[4580]: I0321 05:56:35.678623 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="214e8054-6f70-4078-9418-7ab26e0a523e" containerName="container-00" Mar 21 05:56:35 crc kubenswrapper[4580]: I0321 05:56:35.678977 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="214e8054-6f70-4078-9418-7ab26e0a523e" containerName="container-00" Mar 21 05:56:35 crc kubenswrapper[4580]: I0321 05:56:35.680108 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b4tks/crc-debug-hshf4" Mar 21 05:56:35 crc kubenswrapper[4580]: I0321 05:56:35.754303 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91a78f57-38b4-4602-a0d0-110594a5442b-host\") pod \"crc-debug-hshf4\" (UID: \"91a78f57-38b4-4602-a0d0-110594a5442b\") " pod="openshift-must-gather-b4tks/crc-debug-hshf4" Mar 21 05:56:35 crc kubenswrapper[4580]: I0321 05:56:35.754617 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnt5d\" (UniqueName: \"kubernetes.io/projected/91a78f57-38b4-4602-a0d0-110594a5442b-kube-api-access-fnt5d\") pod \"crc-debug-hshf4\" (UID: \"91a78f57-38b4-4602-a0d0-110594a5442b\") " pod="openshift-must-gather-b4tks/crc-debug-hshf4" Mar 21 05:56:35 crc kubenswrapper[4580]: I0321 05:56:35.856372 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91a78f57-38b4-4602-a0d0-110594a5442b-host\") pod \"crc-debug-hshf4\" (UID: \"91a78f57-38b4-4602-a0d0-110594a5442b\") " pod="openshift-must-gather-b4tks/crc-debug-hshf4" Mar 21 05:56:35 crc kubenswrapper[4580]: I0321 05:56:35.856485 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnt5d\" (UniqueName: \"kubernetes.io/projected/91a78f57-38b4-4602-a0d0-110594a5442b-kube-api-access-fnt5d\") pod \"crc-debug-hshf4\" (UID: \"91a78f57-38b4-4602-a0d0-110594a5442b\") " pod="openshift-must-gather-b4tks/crc-debug-hshf4" Mar 21 05:56:35 crc kubenswrapper[4580]: I0321 05:56:35.856673 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91a78f57-38b4-4602-a0d0-110594a5442b-host\") pod \"crc-debug-hshf4\" (UID: \"91a78f57-38b4-4602-a0d0-110594a5442b\") " pod="openshift-must-gather-b4tks/crc-debug-hshf4" Mar 21 05:56:35 crc 
kubenswrapper[4580]: I0321 05:56:35.878140 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnt5d\" (UniqueName: \"kubernetes.io/projected/91a78f57-38b4-4602-a0d0-110594a5442b-kube-api-access-fnt5d\") pod \"crc-debug-hshf4\" (UID: \"91a78f57-38b4-4602-a0d0-110594a5442b\") " pod="openshift-must-gather-b4tks/crc-debug-hshf4" Mar 21 05:56:36 crc kubenswrapper[4580]: I0321 05:56:36.011090 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b4tks/crc-debug-hshf4" Mar 21 05:56:36 crc kubenswrapper[4580]: W0321 05:56:36.050927 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a78f57_38b4_4602_a0d0_110594a5442b.slice/crio-98945ca46655e9ccf7b5fd01f4f940fbe21491b67c1396bdf2c55354a8ea14ff WatchSource:0}: Error finding container 98945ca46655e9ccf7b5fd01f4f940fbe21491b67c1396bdf2c55354a8ea14ff: Status 404 returned error can't find the container with id 98945ca46655e9ccf7b5fd01f4f940fbe21491b67c1396bdf2c55354a8ea14ff Mar 21 05:56:36 crc kubenswrapper[4580]: I0321 05:56:36.659318 4580 generic.go:334] "Generic (PLEG): container finished" podID="91a78f57-38b4-4602-a0d0-110594a5442b" containerID="3ed67e6da8f6e4f75eb70e867abdae009cfa41e31be1c42734e62bb249d14f85" exitCode=0 Mar 21 05:56:36 crc kubenswrapper[4580]: I0321 05:56:36.659437 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b4tks/crc-debug-hshf4" event={"ID":"91a78f57-38b4-4602-a0d0-110594a5442b","Type":"ContainerDied","Data":"3ed67e6da8f6e4f75eb70e867abdae009cfa41e31be1c42734e62bb249d14f85"} Mar 21 05:56:36 crc kubenswrapper[4580]: I0321 05:56:36.659675 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b4tks/crc-debug-hshf4" event={"ID":"91a78f57-38b4-4602-a0d0-110594a5442b","Type":"ContainerStarted","Data":"98945ca46655e9ccf7b5fd01f4f940fbe21491b67c1396bdf2c55354a8ea14ff"} Mar 21 
05:56:36 crc kubenswrapper[4580]: I0321 05:56:36.700439 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b4tks/crc-debug-hshf4"] Mar 21 05:56:36 crc kubenswrapper[4580]: I0321 05:56:36.715470 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b4tks/crc-debug-hshf4"] Mar 21 05:56:37 crc kubenswrapper[4580]: I0321 05:56:37.764858 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b4tks/crc-debug-hshf4" Mar 21 05:56:37 crc kubenswrapper[4580]: I0321 05:56:37.812326 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91a78f57-38b4-4602-a0d0-110594a5442b-host\") pod \"91a78f57-38b4-4602-a0d0-110594a5442b\" (UID: \"91a78f57-38b4-4602-a0d0-110594a5442b\") " Mar 21 05:56:37 crc kubenswrapper[4580]: I0321 05:56:37.812448 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnt5d\" (UniqueName: \"kubernetes.io/projected/91a78f57-38b4-4602-a0d0-110594a5442b-kube-api-access-fnt5d\") pod \"91a78f57-38b4-4602-a0d0-110594a5442b\" (UID: \"91a78f57-38b4-4602-a0d0-110594a5442b\") " Mar 21 05:56:37 crc kubenswrapper[4580]: I0321 05:56:37.812459 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91a78f57-38b4-4602-a0d0-110594a5442b-host" (OuterVolumeSpecName: "host") pod "91a78f57-38b4-4602-a0d0-110594a5442b" (UID: "91a78f57-38b4-4602-a0d0-110594a5442b"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:56:37 crc kubenswrapper[4580]: I0321 05:56:37.813180 4580 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91a78f57-38b4-4602-a0d0-110594a5442b-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:56:37 crc kubenswrapper[4580]: I0321 05:56:37.818363 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a78f57-38b4-4602-a0d0-110594a5442b-kube-api-access-fnt5d" (OuterVolumeSpecName: "kube-api-access-fnt5d") pod "91a78f57-38b4-4602-a0d0-110594a5442b" (UID: "91a78f57-38b4-4602-a0d0-110594a5442b"). InnerVolumeSpecName "kube-api-access-fnt5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:56:37 crc kubenswrapper[4580]: I0321 05:56:37.915135 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnt5d\" (UniqueName: \"kubernetes.io/projected/91a78f57-38b4-4602-a0d0-110594a5442b-kube-api-access-fnt5d\") on node \"crc\" DevicePath \"\"" Mar 21 05:56:38 crc kubenswrapper[4580]: I0321 05:56:38.680551 4580 scope.go:117] "RemoveContainer" containerID="3ed67e6da8f6e4f75eb70e867abdae009cfa41e31be1c42734e62bb249d14f85" Mar 21 05:56:38 crc kubenswrapper[4580]: I0321 05:56:38.680952 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b4tks/crc-debug-hshf4" Mar 21 05:56:39 crc kubenswrapper[4580]: I0321 05:56:39.631731 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a78f57-38b4-4602-a0d0-110594a5442b" path="/var/lib/kubelet/pods/91a78f57-38b4-4602-a0d0-110594a5442b/volumes" Mar 21 05:56:45 crc kubenswrapper[4580]: I0321 05:56:45.948337 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:56:45 crc kubenswrapper[4580]: I0321 05:56:45.950173 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:57:15 crc kubenswrapper[4580]: I0321 05:57:15.265303 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d486fc764-m7r7b_c225fecd-c259-40cb-898c-78dc724d1db8/barbican-api/0.log" Mar 21 05:57:15 crc kubenswrapper[4580]: I0321 05:57:15.440977 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d486fc764-m7r7b_c225fecd-c259-40cb-898c-78dc724d1db8/barbican-api-log/0.log" Mar 21 05:57:15 crc kubenswrapper[4580]: I0321 05:57:15.517691 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cfb6cbb9d-ln66c_0f4b4bf3-0508-4021-916b-97694fe670ff/barbican-keystone-listener/0.log" Mar 21 05:57:15 crc kubenswrapper[4580]: I0321 05:57:15.767440 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-7cfb6cbb9d-ln66c_0f4b4bf3-0508-4021-916b-97694fe670ff/barbican-keystone-listener-log/0.log" Mar 21 05:57:15 crc kubenswrapper[4580]: I0321 05:57:15.801203 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-787f545779-9db4b_d79bd04a-35d0-48ab-883f-982e3129d435/barbican-worker-log/0.log" Mar 21 05:57:15 crc kubenswrapper[4580]: I0321 05:57:15.853549 4580 scope.go:117] "RemoveContainer" containerID="cdbb3ebaa672afedf8c9f725012e72295f6ee9a7c0ec98e72dbeb5894cf75ef3" Mar 21 05:57:15 crc kubenswrapper[4580]: I0321 05:57:15.882473 4580 scope.go:117] "RemoveContainer" containerID="4face93dcb74c9a008ad3d6aa56b70aa8839a2166d7c98133b5bb7170881e2ca" Mar 21 05:57:15 crc kubenswrapper[4580]: I0321 05:57:15.900238 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-787f545779-9db4b_d79bd04a-35d0-48ab-883f-982e3129d435/barbican-worker/0.log" Mar 21 05:57:15 crc kubenswrapper[4580]: I0321 05:57:15.948312 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:57:15 crc kubenswrapper[4580]: I0321 05:57:15.948399 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:57:15 crc kubenswrapper[4580]: I0321 05:57:15.958343 4580 scope.go:117] "RemoveContainer" containerID="1f78d2990e5621d499ed84708e11d2bc0ba05617aff84e3886e0b410124f5812" Mar 21 05:57:16 crc kubenswrapper[4580]: I0321 05:57:16.300403 4580 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-94x8x_ddfb2a5d-1386-4dac-aee6-316bce48c76b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:16 crc kubenswrapper[4580]: I0321 05:57:16.312060 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c77c9b9f-3e73-4cef-9e10-39bfef8357b5/ceilometer-central-agent/0.log" Mar 21 05:57:16 crc kubenswrapper[4580]: I0321 05:57:16.380268 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c77c9b9f-3e73-4cef-9e10-39bfef8357b5/ceilometer-notification-agent/0.log" Mar 21 05:57:16 crc kubenswrapper[4580]: I0321 05:57:16.433722 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c77c9b9f-3e73-4cef-9e10-39bfef8357b5/proxy-httpd/0.log" Mar 21 05:57:16 crc kubenswrapper[4580]: I0321 05:57:16.534234 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c77c9b9f-3e73-4cef-9e10-39bfef8357b5/sg-core/0.log" Mar 21 05:57:16 crc kubenswrapper[4580]: I0321 05:57:16.640666 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_18848c19-7735-494d-babb-32e04c8ef382/cinder-api/0.log" Mar 21 05:57:16 crc kubenswrapper[4580]: I0321 05:57:16.690888 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_18848c19-7735-494d-babb-32e04c8ef382/cinder-api-log/0.log" Mar 21 05:57:16 crc kubenswrapper[4580]: I0321 05:57:16.857192 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a6a3d4de-9969-48f3-9f1a-9f273f81050a/cinder-scheduler/0.log" Mar 21 05:57:16 crc kubenswrapper[4580]: I0321 05:57:16.861007 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a6a3d4de-9969-48f3-9f1a-9f273f81050a/probe/0.log" Mar 21 05:57:17 crc kubenswrapper[4580]: I0321 05:57:17.299433 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-4chpn_5014a479-6112-4f5c-9824-db4736d248f4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:17 crc kubenswrapper[4580]: I0321 05:57:17.439276 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-gck9n_136b85ae-b1b7-46cf-a8fa-059f29999f31/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:17 crc kubenswrapper[4580]: I0321 05:57:17.523634 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cd9bffc9-mrwrr_0a878571-91e7-486e-8258-fc3298a5e03f/init/0.log" Mar 21 05:57:17 crc kubenswrapper[4580]: I0321 05:57:17.759502 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cd9bffc9-mrwrr_0a878571-91e7-486e-8258-fc3298a5e03f/dnsmasq-dns/0.log" Mar 21 05:57:17 crc kubenswrapper[4580]: I0321 05:57:17.790438 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cd9bffc9-mrwrr_0a878571-91e7-486e-8258-fc3298a5e03f/init/0.log" Mar 21 05:57:17 crc kubenswrapper[4580]: I0321 05:57:17.865714 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-fl44d_73c35bcd-08ba-44f4-96c4-4d29bcf84b5f/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:17 crc kubenswrapper[4580]: I0321 05:57:17.974400 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_551ce5a9-fc21-4f0c-9c38-d53b829c5979/glance-httpd/0.log" Mar 21 05:57:18 crc kubenswrapper[4580]: I0321 05:57:18.082773 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_551ce5a9-fc21-4f0c-9c38-d53b829c5979/glance-log/0.log" Mar 21 05:57:18 crc kubenswrapper[4580]: I0321 05:57:18.253142 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_5f151509-94d2-4991-89f7-c7757d14b867/glance-httpd/0.log" Mar 21 05:57:18 crc kubenswrapper[4580]: I0321 05:57:18.271665 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f151509-94d2-4991-89f7-c7757d14b867/glance-log/0.log" Mar 21 05:57:18 crc kubenswrapper[4580]: I0321 05:57:18.433972 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-67655f8b6-mbx6n_a03ce0fa-f7e8-4b48-bbea-95807f14dd26/horizon/4.log" Mar 21 05:57:18 crc kubenswrapper[4580]: I0321 05:57:18.629018 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-67655f8b6-mbx6n_a03ce0fa-f7e8-4b48-bbea-95807f14dd26/horizon/3.log" Mar 21 05:57:18 crc kubenswrapper[4580]: I0321 05:57:18.816469 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-7skn5_72b35ded-99db-471e-b265-5e1e0467af49/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:18 crc kubenswrapper[4580]: I0321 05:57:18.939049 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-67655f8b6-mbx6n_a03ce0fa-f7e8-4b48-bbea-95807f14dd26/horizon-log/0.log" Mar 21 05:57:19 crc kubenswrapper[4580]: I0321 05:57:19.548309 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6d45658b5d-dfjj4_19491f31-c899-4d84-a81b-262d0660b2c1/keystone-api/0.log" Mar 21 05:57:19 crc kubenswrapper[4580]: I0321 05:57:19.598039 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5b8sc_42a67be6-2662-40e1-a94d-0b7fa55c1bc0/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:19 crc kubenswrapper[4580]: I0321 05:57:19.635048 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ca4bc346-fdf8-4e43-8bbb-ea6c80333c43/kube-state-metrics/0.log" Mar 21 
05:57:20 crc kubenswrapper[4580]: I0321 05:57:20.186604 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6cd755485-pmnqc_0804de84-fb1f-40cf-af99-b67d2eb64fc4/neutron-api/0.log" Mar 21 05:57:20 crc kubenswrapper[4580]: I0321 05:57:20.265591 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6cd755485-pmnqc_0804de84-fb1f-40cf-af99-b67d2eb64fc4/neutron-httpd/0.log" Mar 21 05:57:20 crc kubenswrapper[4580]: I0321 05:57:20.804427 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8hpj_24785e2f-2d74-4dd1-97dd-10e58843652e/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:21 crc kubenswrapper[4580]: I0321 05:57:21.021395 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ae93ee16-d710-434d-b070-65215d559dfb/nova-api-log/0.log" Mar 21 05:57:21 crc kubenswrapper[4580]: I0321 05:57:21.222167 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6dvtv_e355e210-9abe-4bdf-bcbf-70e95e437482/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:21 crc kubenswrapper[4580]: I0321 05:57:21.396929 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ae93ee16-d710-434d-b070-65215d559dfb/nova-api-api/0.log" Mar 21 05:57:21 crc kubenswrapper[4580]: I0321 05:57:21.558591 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_dc118c0b-9b79-4e70-a775-a437c1b83b2c/nova-cell0-conductor-conductor/0.log" Mar 21 05:57:21 crc kubenswrapper[4580]: I0321 05:57:21.635314 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_fcc0c177-5dea-46ab-9eb9-aa66a23d909f/nova-cell1-conductor-conductor/0.log" Mar 21 05:57:21 crc kubenswrapper[4580]: I0321 05:57:21.840920 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b7bd64a0-ec65-4f8c-841c-ca1950434439/nova-cell1-novncproxy-novncproxy/0.log" Mar 21 05:57:22 crc kubenswrapper[4580]: I0321 05:57:22.301476 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_984d329d-aa14-46fb-9c9f-c5f9eb415f73/nova-metadata-log/0.log" Mar 21 05:57:22 crc kubenswrapper[4580]: I0321 05:57:22.749144 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6b628f21-06f6-4838-805f-b0d25851ac35/nova-scheduler-scheduler/0.log" Mar 21 05:57:22 crc kubenswrapper[4580]: I0321 05:57:22.802034 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_984d329d-aa14-46fb-9c9f-c5f9eb415f73/nova-metadata-metadata/0.log" Mar 21 05:57:22 crc kubenswrapper[4580]: I0321 05:57:22.812597 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2da281c0-51d3-4264-8924-83dbc85ecbf0/mysql-bootstrap/0.log" Mar 21 05:57:23 crc kubenswrapper[4580]: I0321 05:57:23.010141 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2da281c0-51d3-4264-8924-83dbc85ecbf0/mysql-bootstrap/0.log" Mar 21 05:57:23 crc kubenswrapper[4580]: I0321 05:57:23.068503 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2da281c0-51d3-4264-8924-83dbc85ecbf0/galera/0.log" Mar 21 05:57:23 crc kubenswrapper[4580]: I0321 05:57:23.262329 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b4f4841a-f9ee-4d9d-b756-77cabd20363a/mysql-bootstrap/0.log" Mar 21 05:57:23 crc kubenswrapper[4580]: I0321 05:57:23.278009 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-gxx9h_bf805790-d6ce-495d-8d85-dd7cf68b4bf3/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:23 crc kubenswrapper[4580]: I0321 05:57:23.536512 4580 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b4f4841a-f9ee-4d9d-b756-77cabd20363a/mysql-bootstrap/0.log" Mar 21 05:57:23 crc kubenswrapper[4580]: I0321 05:57:23.600141 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b4f4841a-f9ee-4d9d-b756-77cabd20363a/galera/0.log" Mar 21 05:57:23 crc kubenswrapper[4580]: I0321 05:57:23.631386 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_286ff68a-a9d7-4592-9146-f9537c8cf329/openstackclient/0.log" Mar 21 05:57:23 crc kubenswrapper[4580]: I0321 05:57:23.859233 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-jjv5q_15016044-062f-44bc-8278-97a43b709083/ovn-controller/0.log" Mar 21 05:57:23 crc kubenswrapper[4580]: I0321 05:57:23.978116 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-vpk6m_cc8eda41-b1d2-4f48-ac6e-59b7856a0917/openstack-network-exporter/0.log" Mar 21 05:57:24 crc kubenswrapper[4580]: I0321 05:57:24.170006 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tqfdg_893ab010-283a-4331-834a-05586719a352/ovsdb-server-init/0.log" Mar 21 05:57:24 crc kubenswrapper[4580]: I0321 05:57:24.516445 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tqfdg_893ab010-283a-4331-834a-05586719a352/ovsdb-server-init/0.log" Mar 21 05:57:24 crc kubenswrapper[4580]: I0321 05:57:24.521981 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tqfdg_893ab010-283a-4331-834a-05586719a352/ovs-vswitchd/0.log" Mar 21 05:57:24 crc kubenswrapper[4580]: I0321 05:57:24.542732 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tqfdg_893ab010-283a-4331-834a-05586719a352/ovsdb-server/0.log" Mar 21 05:57:24 crc kubenswrapper[4580]: I0321 05:57:24.871569 4580 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-northd-0_69db8b67-aa51-41d9-8088-dba10b9bdd0d/ovn-northd/0.log" Mar 21 05:57:24 crc kubenswrapper[4580]: I0321 05:57:24.905145 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_69db8b67-aa51-41d9-8088-dba10b9bdd0d/openstack-network-exporter/0.log" Mar 21 05:57:25 crc kubenswrapper[4580]: I0321 05:57:25.007113 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-j2klj_fe57de6b-1ee3-4bdb-91b8-d81369a7fc72/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:25 crc kubenswrapper[4580]: I0321 05:57:25.186690 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_358c9476-8608-43e6-9912-6be4fb3f2ba8/ovsdbserver-nb/0.log" Mar 21 05:57:25 crc kubenswrapper[4580]: I0321 05:57:25.291058 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_358c9476-8608-43e6-9912-6be4fb3f2ba8/openstack-network-exporter/0.log" Mar 21 05:57:25 crc kubenswrapper[4580]: I0321 05:57:25.375651 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_514b5967-88ad-43e2-aa38-88551fba381d/openstack-network-exporter/0.log" Mar 21 05:57:25 crc kubenswrapper[4580]: I0321 05:57:25.484158 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_514b5967-88ad-43e2-aa38-88551fba381d/ovsdbserver-sb/0.log" Mar 21 05:57:25 crc kubenswrapper[4580]: I0321 05:57:25.731599 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67f74b898d-dtzvd_3cbb1901-c5ee-4f46-aa6d-ac31372a9b83/placement-api/0.log" Mar 21 05:57:25 crc kubenswrapper[4580]: I0321 05:57:25.754790 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67f74b898d-dtzvd_3cbb1901-c5ee-4f46-aa6d-ac31372a9b83/placement-log/0.log" Mar 21 05:57:25 crc kubenswrapper[4580]: I0321 05:57:25.796246 4580 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7619a3e5-e696-412d-8550-c8c30660eacd/setup-container/0.log" Mar 21 05:57:26 crc kubenswrapper[4580]: I0321 05:57:26.310102 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7619a3e5-e696-412d-8550-c8c30660eacd/rabbitmq/0.log" Mar 21 05:57:26 crc kubenswrapper[4580]: I0321 05:57:26.326027 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7619a3e5-e696-412d-8550-c8c30660eacd/setup-container/0.log" Mar 21 05:57:26 crc kubenswrapper[4580]: I0321 05:57:26.450942 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_364da597-ba18-4d63-b1be-1d925e603515/setup-container/0.log" Mar 21 05:57:26 crc kubenswrapper[4580]: I0321 05:57:26.731573 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-69ghn_bb5e8570-68a2-47c9-bd31-4be0389bd713/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:26 crc kubenswrapper[4580]: I0321 05:57:26.750202 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_364da597-ba18-4d63-b1be-1d925e603515/setup-container/0.log" Mar 21 05:57:26 crc kubenswrapper[4580]: I0321 05:57:26.796432 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_364da597-ba18-4d63-b1be-1d925e603515/rabbitmq/0.log" Mar 21 05:57:27 crc kubenswrapper[4580]: I0321 05:57:27.021204 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-dp8qx_251c60b9-f972-4aec-85af-f00d48e21662/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:27 crc kubenswrapper[4580]: I0321 05:57:27.157290 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-tml9h_668fb1a4-aaf2-4d27-ab22-f7d0789f7cf2/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:27 crc kubenswrapper[4580]: I0321 05:57:27.442992 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cxplm_b448b2a2-1171-4d9a-b28f-c0d8805134df/ssh-known-hosts-edpm-deployment/0.log" Mar 21 05:57:27 crc kubenswrapper[4580]: I0321 05:57:27.466105 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-jcqfr_30665d01-e41e-4e5e-ad25-f4430eb5866a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:27 crc kubenswrapper[4580]: I0321 05:57:27.799484 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6dbb667f95-c5g4x_21065819-f94d-4cc9-925f-c4be4eeee0d7/proxy-server/0.log" Mar 21 05:57:27 crc kubenswrapper[4580]: I0321 05:57:27.910022 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6dbb667f95-c5g4x_21065819-f94d-4cc9-925f-c4be4eeee0d7/proxy-httpd/0.log" Mar 21 05:57:28 crc kubenswrapper[4580]: I0321 05:57:28.079252 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-kkh4r_3d23c194-d398-4264-8726-c75316c85eff/swift-ring-rebalance/0.log" Mar 21 05:57:28 crc kubenswrapper[4580]: I0321 05:57:28.247095 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/account-auditor/0.log" Mar 21 05:57:28 crc kubenswrapper[4580]: I0321 05:57:28.285905 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/account-reaper/0.log" Mar 21 05:57:28 crc kubenswrapper[4580]: I0321 05:57:28.400720 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/account-replicator/0.log" Mar 21 05:57:28 crc kubenswrapper[4580]: I0321 05:57:28.414367 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/account-server/0.log" Mar 21 05:57:28 crc kubenswrapper[4580]: I0321 05:57:28.582973 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/container-auditor/0.log" Mar 21 05:57:28 crc kubenswrapper[4580]: I0321 05:57:28.603505 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/container-replicator/0.log" Mar 21 05:57:28 crc kubenswrapper[4580]: I0321 05:57:28.758332 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/container-server/0.log" Mar 21 05:57:28 crc kubenswrapper[4580]: I0321 05:57:28.760384 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/container-updater/0.log" Mar 21 05:57:28 crc kubenswrapper[4580]: I0321 05:57:28.871095 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/object-expirer/0.log" Mar 21 05:57:28 crc kubenswrapper[4580]: I0321 05:57:28.890178 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/object-auditor/0.log" Mar 21 05:57:29 crc kubenswrapper[4580]: I0321 05:57:29.085237 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/object-replicator/0.log" Mar 21 05:57:29 crc kubenswrapper[4580]: I0321 05:57:29.128316 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/object-server/0.log" Mar 21 05:57:29 crc kubenswrapper[4580]: I0321 05:57:29.214429 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/rsync/0.log" Mar 21 05:57:29 crc kubenswrapper[4580]: I0321 05:57:29.219834 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/object-updater/0.log" Mar 21 05:57:29 crc kubenswrapper[4580]: I0321 05:57:29.360950 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d59ab798-9ae9-4f47-b58b-36417592eef2/swift-recon-cron/0.log" Mar 21 05:57:29 crc kubenswrapper[4580]: I0321 05:57:29.776101 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_c692d589-bfb1-449b-91ff-8517954bc204/tempest-tests-tempest-tests-runner/0.log" Mar 21 05:57:29 crc kubenswrapper[4580]: I0321 05:57:29.914483 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_25fd8fca-3d1d-4c2e-af01-c5ca004814fd/test-operator-logs-container/0.log" Mar 21 05:57:30 crc kubenswrapper[4580]: I0321 05:57:30.179934 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-6hxl2_01313952-673b-45c9-b24b-0317ed817834/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:30 crc kubenswrapper[4580]: I0321 05:57:30.529803 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-bc666_53375dd9-0a2b-413f-8fa2-1ebd8d63df42/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:57:39 crc kubenswrapper[4580]: I0321 05:57:39.648529 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_226921bf-412a-4dc6-a722-3fcf5ecc7fdc/memcached/0.log" Mar 21 05:57:45 crc kubenswrapper[4580]: I0321 05:57:45.947937 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:57:45 crc kubenswrapper[4580]: I0321 05:57:45.948577 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:57:45 crc kubenswrapper[4580]: I0321 05:57:45.948645 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 05:57:45 crc kubenswrapper[4580]: I0321 05:57:45.949821 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:57:45 crc kubenswrapper[4580]: I0321 05:57:45.949922 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" gracePeriod=600 Mar 21 05:57:46 crc kubenswrapper[4580]: E0321 05:57:46.079811 4580 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:57:46 crc kubenswrapper[4580]: I0321 05:57:46.390523 4580 generic.go:334] "Generic (PLEG): container finished" podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" exitCode=0 Mar 21 05:57:46 crc kubenswrapper[4580]: I0321 05:57:46.390576 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12"} Mar 21 05:57:46 crc kubenswrapper[4580]: I0321 05:57:46.390662 4580 scope.go:117] "RemoveContainer" containerID="d3bcda5a760c6a54453a6b1a9b32ee5ceff6b3f290960e7e502c467bc51c6266" Mar 21 05:57:46 crc kubenswrapper[4580]: I0321 05:57:46.391850 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 05:57:46 crc kubenswrapper[4580]: E0321 05:57:46.392249 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:57:59 crc kubenswrapper[4580]: I0321 05:57:59.618215 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 
21 05:57:59 crc kubenswrapper[4580]: E0321 05:57:59.619021 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.144102 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567878-tlz74"] Mar 21 05:58:00 crc kubenswrapper[4580]: E0321 05:58:00.144713 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a78f57-38b4-4602-a0d0-110594a5442b" containerName="container-00" Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.144740 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a78f57-38b4-4602-a0d0-110594a5442b" containerName="container-00" Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.145053 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="91a78f57-38b4-4602-a0d0-110594a5442b" containerName="container-00" Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.145906 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567878-tlz74" Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.153307 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.153637 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.153878 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.155020 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567878-tlz74"] Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.278727 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nncdc\" (UniqueName: \"kubernetes.io/projected/a2dd27b1-8529-4275-9035-ac3c8622f707-kube-api-access-nncdc\") pod \"auto-csr-approver-29567878-tlz74\" (UID: \"a2dd27b1-8529-4275-9035-ac3c8622f707\") " pod="openshift-infra/auto-csr-approver-29567878-tlz74" Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.381375 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nncdc\" (UniqueName: \"kubernetes.io/projected/a2dd27b1-8529-4275-9035-ac3c8622f707-kube-api-access-nncdc\") pod \"auto-csr-approver-29567878-tlz74\" (UID: \"a2dd27b1-8529-4275-9035-ac3c8622f707\") " pod="openshift-infra/auto-csr-approver-29567878-tlz74" Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.401447 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nncdc\" (UniqueName: \"kubernetes.io/projected/a2dd27b1-8529-4275-9035-ac3c8622f707-kube-api-access-nncdc\") pod \"auto-csr-approver-29567878-tlz74\" (UID: \"a2dd27b1-8529-4275-9035-ac3c8622f707\") " 
pod="openshift-infra/auto-csr-approver-29567878-tlz74" Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.467653 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567878-tlz74" Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.949132 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567878-tlz74"] Mar 21 05:58:00 crc kubenswrapper[4580]: I0321 05:58:00.955327 4580 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:58:01 crc kubenswrapper[4580]: I0321 05:58:01.525954 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567878-tlz74" event={"ID":"a2dd27b1-8529-4275-9035-ac3c8622f707","Type":"ContainerStarted","Data":"6d2730c6c1e1b1c6e7e34ec8f29f89eb3522ad628e25954b6d0c402a6f5fb71b"} Mar 21 05:58:02 crc kubenswrapper[4580]: I0321 05:58:02.540637 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567878-tlz74" event={"ID":"a2dd27b1-8529-4275-9035-ac3c8622f707","Type":"ContainerStarted","Data":"2a78ff4ba5755faf0d13e2677de6ac88c3782d5a73b424adf91c4eabed5f0a03"} Mar 21 05:58:02 crc kubenswrapper[4580]: I0321 05:58:02.570096 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567878-tlz74" podStartSLOduration=1.6442982160000001 podStartE2EDuration="2.570076019s" podCreationTimestamp="2026-03-21 05:58:00 +0000 UTC" firstStartedPulling="2026-03-21 05:58:00.955124401 +0000 UTC m=+3986.037708029" lastFinishedPulling="2026-03-21 05:58:01.880902194 +0000 UTC m=+3986.963485832" observedRunningTime="2026-03-21 05:58:02.566184244 +0000 UTC m=+3987.648767872" watchObservedRunningTime="2026-03-21 05:58:02.570076019 +0000 UTC m=+3987.652659647" Mar 21 05:58:02 crc kubenswrapper[4580]: E0321 05:58:02.768085 4580 cadvisor_stats_provider.go:516] "Partial failure 
issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2dd27b1_8529_4275_9035_ac3c8622f707.slice/crio-2a78ff4ba5755faf0d13e2677de6ac88c3782d5a73b424adf91c4eabed5f0a03.scope\": RecentStats: unable to find data in memory cache]" Mar 21 05:58:03 crc kubenswrapper[4580]: I0321 05:58:03.549791 4580 generic.go:334] "Generic (PLEG): container finished" podID="a2dd27b1-8529-4275-9035-ac3c8622f707" containerID="2a78ff4ba5755faf0d13e2677de6ac88c3782d5a73b424adf91c4eabed5f0a03" exitCode=0 Mar 21 05:58:03 crc kubenswrapper[4580]: I0321 05:58:03.549898 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567878-tlz74" event={"ID":"a2dd27b1-8529-4275-9035-ac3c8622f707","Type":"ContainerDied","Data":"2a78ff4ba5755faf0d13e2677de6ac88c3782d5a73b424adf91c4eabed5f0a03"} Mar 21 05:58:04 crc kubenswrapper[4580]: I0321 05:58:04.901380 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6cc65c69fc-lkjq4_a96026f1-4dcb-483a-83da-aecc72e7590c/manager/0.log" Mar 21 05:58:04 crc kubenswrapper[4580]: I0321 05:58:04.933941 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567878-tlz74" Mar 21 05:58:05 crc kubenswrapper[4580]: I0321 05:58:05.083657 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nncdc\" (UniqueName: \"kubernetes.io/projected/a2dd27b1-8529-4275-9035-ac3c8622f707-kube-api-access-nncdc\") pod \"a2dd27b1-8529-4275-9035-ac3c8622f707\" (UID: \"a2dd27b1-8529-4275-9035-ac3c8622f707\") " Mar 21 05:58:05 crc kubenswrapper[4580]: I0321 05:58:05.098002 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2dd27b1-8529-4275-9035-ac3c8622f707-kube-api-access-nncdc" (OuterVolumeSpecName: "kube-api-access-nncdc") pod "a2dd27b1-8529-4275-9035-ac3c8622f707" (UID: "a2dd27b1-8529-4275-9035-ac3c8622f707"). InnerVolumeSpecName "kube-api-access-nncdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:58:05 crc kubenswrapper[4580]: I0321 05:58:05.164082 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/util/0.log" Mar 21 05:58:05 crc kubenswrapper[4580]: I0321 05:58:05.186988 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nncdc\" (UniqueName: \"kubernetes.io/projected/a2dd27b1-8529-4275-9035-ac3c8622f707-kube-api-access-nncdc\") on node \"crc\" DevicePath \"\"" Mar 21 05:58:05 crc kubenswrapper[4580]: I0321 05:58:05.424584 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/util/0.log" Mar 21 05:58:05 crc kubenswrapper[4580]: I0321 05:58:05.583090 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567878-tlz74" 
event={"ID":"a2dd27b1-8529-4275-9035-ac3c8622f707","Type":"ContainerDied","Data":"6d2730c6c1e1b1c6e7e34ec8f29f89eb3522ad628e25954b6d0c402a6f5fb71b"} Mar 21 05:58:05 crc kubenswrapper[4580]: I0321 05:58:05.583340 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d2730c6c1e1b1c6e7e34ec8f29f89eb3522ad628e25954b6d0c402a6f5fb71b" Mar 21 05:58:05 crc kubenswrapper[4580]: I0321 05:58:05.583446 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567878-tlz74" Mar 21 05:58:05 crc kubenswrapper[4580]: I0321 05:58:05.748066 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/pull/0.log" Mar 21 05:58:05 crc kubenswrapper[4580]: I0321 05:58:05.933214 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6d77645966-vxzhk_5522a0a6-b385-4bf6-990c-5a07561257b0/manager/0.log" Mar 21 05:58:05 crc kubenswrapper[4580]: I0321 05:58:05.999313 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/pull/0.log" Mar 21 05:58:06 crc kubenswrapper[4580]: I0321 05:58:06.004752 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567872-fhqzx"] Mar 21 05:58:06 crc kubenswrapper[4580]: I0321 05:58:06.013201 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567872-fhqzx"] Mar 21 05:58:06 crc kubenswrapper[4580]: I0321 05:58:06.489120 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/extract/0.log" Mar 21 05:58:06 crc kubenswrapper[4580]: I0321 
05:58:06.521613 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/util/0.log" Mar 21 05:58:06 crc kubenswrapper[4580]: I0321 05:58:06.590606 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e64d3be5dc8c79ee738f66b6eb537b1fc58ccf6efecbe578c06534d379d9rz4_1afb6781-0e31-4285-aac3-f6ad107c14e5/pull/0.log" Mar 21 05:58:06 crc kubenswrapper[4580]: I0321 05:58:06.899683 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7d559dcdbd-gqlhn_fad28507-ca7b-4452-b392-f0b68e1f9d64/manager/0.log" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.244177 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-66dd9d474d-kw6px_4faee52b-73ab-41d7-a319-33eb67e1aa30/manager/0.log" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.321834 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-64dc66d669-lk8d6_127bc03d-748e-4919-97f8-6f66ab3e2a8a/manager/0.log" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.645550 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16298bac-6153-4f68-8eb4-551cde77ea48" path="/var/lib/kubelet/pods/16298bac-6153-4f68-8eb4-551cde77ea48/volumes" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.646432 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ztgvt"] Mar 21 05:58:07 crc kubenswrapper[4580]: E0321 05:58:07.647999 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2dd27b1-8529-4275-9035-ac3c8622f707" containerName="oc" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.648022 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2dd27b1-8529-4275-9035-ac3c8622f707" 
containerName="oc" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.648266 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2dd27b1-8529-4275-9035-ac3c8622f707" containerName="oc" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.649613 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.733409 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ztgvt"] Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.736909 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x46c\" (UniqueName: \"kubernetes.io/projected/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-kube-api-access-6x46c\") pod \"redhat-operators-ztgvt\" (UID: \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\") " pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.737043 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-utilities\") pod \"redhat-operators-ztgvt\" (UID: \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\") " pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.737185 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-catalog-content\") pod \"redhat-operators-ztgvt\" (UID: \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\") " pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.838881 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-catalog-content\") pod \"redhat-operators-ztgvt\" (UID: \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\") " pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.839176 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x46c\" (UniqueName: \"kubernetes.io/projected/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-kube-api-access-6x46c\") pod \"redhat-operators-ztgvt\" (UID: \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\") " pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.840367 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-utilities\") pod \"redhat-operators-ztgvt\" (UID: \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\") " pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.841459 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-catalog-content\") pod \"redhat-operators-ztgvt\" (UID: \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\") " pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.845291 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-utilities\") pod \"redhat-operators-ztgvt\" (UID: \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\") " pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.868984 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x46c\" (UniqueName: 
\"kubernetes.io/projected/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-kube-api-access-6x46c\") pod \"redhat-operators-ztgvt\" (UID: \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\") " pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:07 crc kubenswrapper[4580]: I0321 05:58:07.973641 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:08 crc kubenswrapper[4580]: I0321 05:58:08.141959 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5595c7d6ff-nd42d_b56378b1-33b0-4032-a383-49163ca1811d/manager/0.log" Mar 21 05:58:08 crc kubenswrapper[4580]: I0321 05:58:08.440840 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6b77b7676d-kmqjb_21248f61-caf9-4660-8299-3b10368fa8ad/manager/0.log" Mar 21 05:58:08 crc kubenswrapper[4580]: I0321 05:58:08.598576 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ztgvt"] Mar 21 05:58:08 crc kubenswrapper[4580]: I0321 05:58:08.664463 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgvt" event={"ID":"1af6caf6-a2ab-433e-820d-2cb12c6a5be4","Type":"ContainerStarted","Data":"119b1d97bd386d3baf53d63ef8ddcb7746dcd00cff1023e93cfba5afc3d96884"} Mar 21 05:58:08 crc kubenswrapper[4580]: I0321 05:58:08.883255 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-784c64596-vdvhl_02dbd40b-11b9-4fca-9617-72b7be489626/manager/0.log" Mar 21 05:58:09 crc kubenswrapper[4580]: I0321 05:58:09.296411 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-fbf7bbb96-mn29l_d45ead43-2f4d-46fc-857f-7e6dbb3e08f6/manager/0.log" Mar 21 05:58:09 crc kubenswrapper[4580]: I0321 05:58:09.438763 4580 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6f5b7bcd4-7dkgm_a216d106-9a69-4143-8766-4e505f2b5a8f/manager/0.log" Mar 21 05:58:09 crc kubenswrapper[4580]: I0321 05:58:09.685679 4580 generic.go:334] "Generic (PLEG): container finished" podID="1af6caf6-a2ab-433e-820d-2cb12c6a5be4" containerID="d860d25a67fdd7fe8e1e09a9f6d73e15c7abcac8b7e63ca89e78a06480b776a1" exitCode=0 Mar 21 05:58:09 crc kubenswrapper[4580]: I0321 05:58:09.685944 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgvt" event={"ID":"1af6caf6-a2ab-433e-820d-2cb12c6a5be4","Type":"ContainerDied","Data":"d860d25a67fdd7fe8e1e09a9f6d73e15c7abcac8b7e63ca89e78a06480b776a1"} Mar 21 05:58:10 crc kubenswrapper[4580]: I0321 05:58:10.191884 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5cfd84c587-6h2nr_d3c84591-dfcf-48e6-a022-25562660675e/manager/0.log" Mar 21 05:58:10 crc kubenswrapper[4580]: I0321 05:58:10.512166 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6744dd545c-sdpcs_6f7dea10-53e8-4c25-87bc-ffd154d4cb7d/manager/0.log" Mar 21 05:58:10 crc kubenswrapper[4580]: I0321 05:58:10.617602 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 05:58:10 crc kubenswrapper[4580]: E0321 05:58:10.617931 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:58:10 crc kubenswrapper[4580]: I0321 05:58:10.707167 4580 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgvt" event={"ID":"1af6caf6-a2ab-433e-820d-2cb12c6a5be4","Type":"ContainerStarted","Data":"a75b77dd61b9f28b6222c5bff8454fbfc43232a24e0fa958bb6cd939f33650e4"} Mar 21 05:58:10 crc kubenswrapper[4580]: I0321 05:58:10.783819 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-56f74467c6-2h95r_67d1d125-57c7-4c30-a51a-24db28fb4818/manager/0.log" Mar 21 05:58:10 crc kubenswrapper[4580]: I0321 05:58:10.838209 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-bc5c78db9-d5jll_2535b22b-0bed-4ffd-9430-ca9fb3230c62/manager/0.log" Mar 21 05:58:11 crc kubenswrapper[4580]: I0321 05:58:11.039122 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-qmw5v_2e366c15-abc4-4e05-9054-cd7828e00059/manager/0.log" Mar 21 05:58:11 crc kubenswrapper[4580]: I0321 05:58:11.557081 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-r2tm2_0b8cf1e5-6f84-4595-be65-efc781baa914/registry-server/0.log" Mar 21 05:58:12 crc kubenswrapper[4580]: I0321 05:58:12.048458 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-68b88cfb78-cf92m_d316a65b-0041-42cc-bf46-c6c8801c44a5/operator/0.log" Mar 21 05:58:12 crc kubenswrapper[4580]: I0321 05:58:12.342256 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-846c4cdcb7-kg5cb_bd6a540f-0a1b-4098-8573-b9049d52f49b/manager/0.log" Mar 21 05:58:12 crc kubenswrapper[4580]: I0321 05:58:12.409218 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-659fb58c6b-ln984_0e31c4f0-9b9d-4c10-84de-d15718775f9a/manager/0.log" Mar 21 
05:58:12 crc kubenswrapper[4580]: I0321 05:58:12.811256 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5mdkx_92286fdb-b69e-4028-8a93-3517469a731c/operator/0.log" Mar 21 05:58:12 crc kubenswrapper[4580]: I0321 05:58:12.936885 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-867f54bc44-78zl5_8b025526-696f-4d7d-82ed-df03050fa1fd/manager/0.log" Mar 21 05:58:13 crc kubenswrapper[4580]: I0321 05:58:13.147269 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d84559f47-g52cx_2a0c721d-68cd-46de-8292-6bd8373e1106/manager/0.log" Mar 21 05:58:13 crc kubenswrapper[4580]: I0321 05:58:13.169440 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-rqw4v_5992ccfd-4585-49b2-84ff-3f1fe6812a82/manager/0.log" Mar 21 05:58:13 crc kubenswrapper[4580]: I0321 05:58:13.297577 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-684bbdfff8-7nr7w_19a22b87-c6f3-4020-aa11-2a940041f49c/manager/0.log" Mar 21 05:58:13 crc kubenswrapper[4580]: I0321 05:58:13.381610 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-74d6f7b5c-q8jxn_6080f6a7-a68f-447a-bedd-182cd69337b5/manager/0.log" Mar 21 05:58:16 crc kubenswrapper[4580]: I0321 05:58:16.049628 4580 scope.go:117] "RemoveContainer" containerID="34cf123f49cc5b86f6dd1f47ef6222c35c002b386517787bbc0b2033f3312e54" Mar 21 05:58:18 crc kubenswrapper[4580]: I0321 05:58:18.028567 4580 generic.go:334] "Generic (PLEG): container finished" podID="1af6caf6-a2ab-433e-820d-2cb12c6a5be4" containerID="a75b77dd61b9f28b6222c5bff8454fbfc43232a24e0fa958bb6cd939f33650e4" exitCode=0 Mar 21 05:58:18 crc kubenswrapper[4580]: I0321 05:58:18.028632 
4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgvt" event={"ID":"1af6caf6-a2ab-433e-820d-2cb12c6a5be4","Type":"ContainerDied","Data":"a75b77dd61b9f28b6222c5bff8454fbfc43232a24e0fa958bb6cd939f33650e4"} Mar 21 05:58:19 crc kubenswrapper[4580]: I0321 05:58:19.040238 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgvt" event={"ID":"1af6caf6-a2ab-433e-820d-2cb12c6a5be4","Type":"ContainerStarted","Data":"346d498fcae3ad0e53ab33fe0cff8b210d30254f4d62169481b8d1ac996a0854"} Mar 21 05:58:19 crc kubenswrapper[4580]: I0321 05:58:19.070461 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ztgvt" podStartSLOduration=3.3313546880000002 podStartE2EDuration="12.07044426s" podCreationTimestamp="2026-03-21 05:58:07 +0000 UTC" firstStartedPulling="2026-03-21 05:58:09.696813222 +0000 UTC m=+3994.779396850" lastFinishedPulling="2026-03-21 05:58:18.435902794 +0000 UTC m=+4003.518486422" observedRunningTime="2026-03-21 05:58:19.066217446 +0000 UTC m=+4004.148801094" watchObservedRunningTime="2026-03-21 05:58:19.07044426 +0000 UTC m=+4004.153027888" Mar 21 05:58:22 crc kubenswrapper[4580]: I0321 05:58:22.618658 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 05:58:22 crc kubenswrapper[4580]: E0321 05:58:22.619460 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:58:27 crc kubenswrapper[4580]: I0321 05:58:27.974093 4580 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:27 crc kubenswrapper[4580]: I0321 05:58:27.975633 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:29 crc kubenswrapper[4580]: I0321 05:58:29.027955 4580 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ztgvt" podUID="1af6caf6-a2ab-433e-820d-2cb12c6a5be4" containerName="registry-server" probeResult="failure" output=< Mar 21 05:58:29 crc kubenswrapper[4580]: timeout: failed to connect service ":50051" within 1s Mar 21 05:58:29 crc kubenswrapper[4580]: > Mar 21 05:58:36 crc kubenswrapper[4580]: I0321 05:58:36.618470 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 05:58:36 crc kubenswrapper[4580]: E0321 05:58:36.619219 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:58:38 crc kubenswrapper[4580]: I0321 05:58:38.026765 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:38 crc kubenswrapper[4580]: I0321 05:58:38.053853 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jjvsm_48bae9ea-47af-484e-a8c4-b6c3e49438e5/control-plane-machine-set-operator/0.log" Mar 21 05:58:38 crc kubenswrapper[4580]: I0321 05:58:38.080593 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:38 crc kubenswrapper[4580]: I0321 05:58:38.421426 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cvqg4_da00c0bc-2ff1-4b15-be1f-8fac48921976/kube-rbac-proxy/0.log" Mar 21 05:58:38 crc kubenswrapper[4580]: I0321 05:58:38.460893 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cvqg4_da00c0bc-2ff1-4b15-be1f-8fac48921976/machine-api-operator/0.log" Mar 21 05:58:38 crc kubenswrapper[4580]: I0321 05:58:38.824880 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ztgvt"] Mar 21 05:58:39 crc kubenswrapper[4580]: I0321 05:58:39.211512 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ztgvt" podUID="1af6caf6-a2ab-433e-820d-2cb12c6a5be4" containerName="registry-server" containerID="cri-o://346d498fcae3ad0e53ab33fe0cff8b210d30254f4d62169481b8d1ac996a0854" gracePeriod=2 Mar 21 05:58:39 crc kubenswrapper[4580]: I0321 05:58:39.737510 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:39 crc kubenswrapper[4580]: I0321 05:58:39.861820 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x46c\" (UniqueName: \"kubernetes.io/projected/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-kube-api-access-6x46c\") pod \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\" (UID: \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\") " Mar 21 05:58:39 crc kubenswrapper[4580]: I0321 05:58:39.861912 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-catalog-content\") pod \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\" (UID: \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\") " Mar 21 05:58:39 crc kubenswrapper[4580]: I0321 05:58:39.862054 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-utilities\") pod \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\" (UID: \"1af6caf6-a2ab-433e-820d-2cb12c6a5be4\") " Mar 21 05:58:39 crc kubenswrapper[4580]: I0321 05:58:39.862708 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-utilities" (OuterVolumeSpecName: "utilities") pod "1af6caf6-a2ab-433e-820d-2cb12c6a5be4" (UID: "1af6caf6-a2ab-433e-820d-2cb12c6a5be4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:58:39 crc kubenswrapper[4580]: I0321 05:58:39.867802 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-kube-api-access-6x46c" (OuterVolumeSpecName: "kube-api-access-6x46c") pod "1af6caf6-a2ab-433e-820d-2cb12c6a5be4" (UID: "1af6caf6-a2ab-433e-820d-2cb12c6a5be4"). InnerVolumeSpecName "kube-api-access-6x46c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:58:39 crc kubenswrapper[4580]: I0321 05:58:39.965606 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:58:39 crc kubenswrapper[4580]: I0321 05:58:39.965650 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x46c\" (UniqueName: \"kubernetes.io/projected/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-kube-api-access-6x46c\") on node \"crc\" DevicePath \"\"" Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.001493 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1af6caf6-a2ab-433e-820d-2cb12c6a5be4" (UID: "1af6caf6-a2ab-433e-820d-2cb12c6a5be4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.067287 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1af6caf6-a2ab-433e-820d-2cb12c6a5be4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.220686 4580 generic.go:334] "Generic (PLEG): container finished" podID="1af6caf6-a2ab-433e-820d-2cb12c6a5be4" containerID="346d498fcae3ad0e53ab33fe0cff8b210d30254f4d62169481b8d1ac996a0854" exitCode=0 Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.220739 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgvt" event={"ID":"1af6caf6-a2ab-433e-820d-2cb12c6a5be4","Type":"ContainerDied","Data":"346d498fcae3ad0e53ab33fe0cff8b210d30254f4d62169481b8d1ac996a0854"} Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.220765 4580 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ztgvt" Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.220803 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztgvt" event={"ID":"1af6caf6-a2ab-433e-820d-2cb12c6a5be4","Type":"ContainerDied","Data":"119b1d97bd386d3baf53d63ef8ddcb7746dcd00cff1023e93cfba5afc3d96884"} Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.220825 4580 scope.go:117] "RemoveContainer" containerID="346d498fcae3ad0e53ab33fe0cff8b210d30254f4d62169481b8d1ac996a0854" Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.267224 4580 scope.go:117] "RemoveContainer" containerID="a75b77dd61b9f28b6222c5bff8454fbfc43232a24e0fa958bb6cd939f33650e4" Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.278391 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ztgvt"] Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.286740 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ztgvt"] Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.288481 4580 scope.go:117] "RemoveContainer" containerID="d860d25a67fdd7fe8e1e09a9f6d73e15c7abcac8b7e63ca89e78a06480b776a1" Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.324965 4580 scope.go:117] "RemoveContainer" containerID="346d498fcae3ad0e53ab33fe0cff8b210d30254f4d62169481b8d1ac996a0854" Mar 21 05:58:40 crc kubenswrapper[4580]: E0321 05:58:40.325731 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346d498fcae3ad0e53ab33fe0cff8b210d30254f4d62169481b8d1ac996a0854\": container with ID starting with 346d498fcae3ad0e53ab33fe0cff8b210d30254f4d62169481b8d1ac996a0854 not found: ID does not exist" containerID="346d498fcae3ad0e53ab33fe0cff8b210d30254f4d62169481b8d1ac996a0854" Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.325763 4580 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346d498fcae3ad0e53ab33fe0cff8b210d30254f4d62169481b8d1ac996a0854"} err="failed to get container status \"346d498fcae3ad0e53ab33fe0cff8b210d30254f4d62169481b8d1ac996a0854\": rpc error: code = NotFound desc = could not find container \"346d498fcae3ad0e53ab33fe0cff8b210d30254f4d62169481b8d1ac996a0854\": container with ID starting with 346d498fcae3ad0e53ab33fe0cff8b210d30254f4d62169481b8d1ac996a0854 not found: ID does not exist" Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.325796 4580 scope.go:117] "RemoveContainer" containerID="a75b77dd61b9f28b6222c5bff8454fbfc43232a24e0fa958bb6cd939f33650e4" Mar 21 05:58:40 crc kubenswrapper[4580]: E0321 05:58:40.326108 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75b77dd61b9f28b6222c5bff8454fbfc43232a24e0fa958bb6cd939f33650e4\": container with ID starting with a75b77dd61b9f28b6222c5bff8454fbfc43232a24e0fa958bb6cd939f33650e4 not found: ID does not exist" containerID="a75b77dd61b9f28b6222c5bff8454fbfc43232a24e0fa958bb6cd939f33650e4" Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.326132 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75b77dd61b9f28b6222c5bff8454fbfc43232a24e0fa958bb6cd939f33650e4"} err="failed to get container status \"a75b77dd61b9f28b6222c5bff8454fbfc43232a24e0fa958bb6cd939f33650e4\": rpc error: code = NotFound desc = could not find container \"a75b77dd61b9f28b6222c5bff8454fbfc43232a24e0fa958bb6cd939f33650e4\": container with ID starting with a75b77dd61b9f28b6222c5bff8454fbfc43232a24e0fa958bb6cd939f33650e4 not found: ID does not exist" Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.326147 4580 scope.go:117] "RemoveContainer" containerID="d860d25a67fdd7fe8e1e09a9f6d73e15c7abcac8b7e63ca89e78a06480b776a1" Mar 21 05:58:40 crc kubenswrapper[4580]: E0321 
05:58:40.326370 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d860d25a67fdd7fe8e1e09a9f6d73e15c7abcac8b7e63ca89e78a06480b776a1\": container with ID starting with d860d25a67fdd7fe8e1e09a9f6d73e15c7abcac8b7e63ca89e78a06480b776a1 not found: ID does not exist" containerID="d860d25a67fdd7fe8e1e09a9f6d73e15c7abcac8b7e63ca89e78a06480b776a1" Mar 21 05:58:40 crc kubenswrapper[4580]: I0321 05:58:40.326394 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d860d25a67fdd7fe8e1e09a9f6d73e15c7abcac8b7e63ca89e78a06480b776a1"} err="failed to get container status \"d860d25a67fdd7fe8e1e09a9f6d73e15c7abcac8b7e63ca89e78a06480b776a1\": rpc error: code = NotFound desc = could not find container \"d860d25a67fdd7fe8e1e09a9f6d73e15c7abcac8b7e63ca89e78a06480b776a1\": container with ID starting with d860d25a67fdd7fe8e1e09a9f6d73e15c7abcac8b7e63ca89e78a06480b776a1 not found: ID does not exist" Mar 21 05:58:41 crc kubenswrapper[4580]: I0321 05:58:41.627690 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1af6caf6-a2ab-433e-820d-2cb12c6a5be4" path="/var/lib/kubelet/pods/1af6caf6-a2ab-433e-820d-2cb12c6a5be4/volumes" Mar 21 05:58:48 crc kubenswrapper[4580]: I0321 05:58:48.618390 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 05:58:48 crc kubenswrapper[4580]: E0321 05:58:48.619194 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:58:54 crc kubenswrapper[4580]: I0321 05:58:54.945423 
4580 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-ssv6b_cc874172-d2d5-4811-8b52-8822da8cb97f/cert-manager-controller/0.log" Mar 21 05:58:55 crc kubenswrapper[4580]: I0321 05:58:55.210572 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-plhxw_cdd054c6-1468-4d34-866c-612b69c7bb4f/cert-manager-cainjector/0.log" Mar 21 05:58:55 crc kubenswrapper[4580]: I0321 05:58:55.505918 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-kcmc2_7154e53a-0974-4463-9d9a-20cea09f0e94/cert-manager-webhook/0.log" Mar 21 05:58:59 crc kubenswrapper[4580]: I0321 05:58:59.618645 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 05:58:59 crc kubenswrapper[4580]: E0321 05:58:59.619506 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:59:09 crc kubenswrapper[4580]: I0321 05:59:09.564020 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-6t69s_3f4e68cb-70a1-40bd-815d-e35e0a3337a0/nmstate-console-plugin/0.log" Mar 21 05:59:09 crc kubenswrapper[4580]: I0321 05:59:09.755341 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-ftlkm_c458e5d3-8c0a-4135-aba8-54854b16c411/nmstate-handler/0.log" Mar 21 05:59:09 crc kubenswrapper[4580]: I0321 05:59:09.781040 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-zfbrs_1b699e53-ece3-49d9-9f68-c3558aef7892/kube-rbac-proxy/0.log" Mar 21 05:59:09 crc kubenswrapper[4580]: I0321 05:59:09.790170 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-zfbrs_1b699e53-ece3-49d9-9f68-c3558aef7892/nmstate-metrics/0.log" Mar 21 05:59:09 crc kubenswrapper[4580]: I0321 05:59:09.978693 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-w4mjf_449ae922-f55a-437e-b18a-d6e2700cc02e/nmstate-webhook/0.log" Mar 21 05:59:10 crc kubenswrapper[4580]: I0321 05:59:10.042764 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-2fctj_b980d3d5-4f25-47c0-9679-8662b237e1b7/nmstate-operator/0.log" Mar 21 05:59:10 crc kubenswrapper[4580]: I0321 05:59:10.618762 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 05:59:10 crc kubenswrapper[4580]: E0321 05:59:10.619034 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:59:25 crc kubenswrapper[4580]: I0321 05:59:25.625276 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 05:59:25 crc kubenswrapper[4580]: E0321 05:59:25.625887 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:59:36 crc kubenswrapper[4580]: I0321 05:59:36.619071 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 05:59:36 crc kubenswrapper[4580]: E0321 05:59:36.620269 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 05:59:41 crc kubenswrapper[4580]: I0321 05:59:41.638851 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-vlfhg_ab7c431f-e254-4c36-a240-15ec5cbb14e9/kube-rbac-proxy/0.log" Mar 21 05:59:41 crc kubenswrapper[4580]: I0321 05:59:41.722562 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-vlfhg_ab7c431f-e254-4c36-a240-15ec5cbb14e9/controller/0.log" Mar 21 05:59:42 crc kubenswrapper[4580]: I0321 05:59:42.661862 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-frr-files/0.log" Mar 21 05:59:42 crc kubenswrapper[4580]: I0321 05:59:42.931931 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-frr-files/0.log" Mar 21 05:59:43 crc kubenswrapper[4580]: I0321 05:59:43.015646 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-reloader/0.log" Mar 21 
05:59:43 crc kubenswrapper[4580]: I0321 05:59:43.052668 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-metrics/0.log" Mar 21 05:59:43 crc kubenswrapper[4580]: I0321 05:59:43.103472 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-reloader/0.log" Mar 21 05:59:43 crc kubenswrapper[4580]: I0321 05:59:43.271121 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-metrics/0.log" Mar 21 05:59:43 crc kubenswrapper[4580]: I0321 05:59:43.272394 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-reloader/0.log" Mar 21 05:59:43 crc kubenswrapper[4580]: I0321 05:59:43.302730 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-frr-files/0.log" Mar 21 05:59:43 crc kubenswrapper[4580]: I0321 05:59:43.384202 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-metrics/0.log" Mar 21 05:59:43 crc kubenswrapper[4580]: I0321 05:59:43.600242 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-reloader/0.log" Mar 21 05:59:43 crc kubenswrapper[4580]: I0321 05:59:43.603354 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/controller/0.log" Mar 21 05:59:43 crc kubenswrapper[4580]: I0321 05:59:43.638058 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-frr-files/0.log" Mar 21 05:59:43 crc kubenswrapper[4580]: I0321 05:59:43.684038 4580 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/cp-metrics/0.log" Mar 21 05:59:43 crc kubenswrapper[4580]: I0321 05:59:43.835896 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/frr-metrics/0.log" Mar 21 05:59:43 crc kubenswrapper[4580]: I0321 05:59:43.905697 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/kube-rbac-proxy/0.log" Mar 21 05:59:44 crc kubenswrapper[4580]: I0321 05:59:44.389626 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/kube-rbac-proxy-frr/0.log" Mar 21 05:59:44 crc kubenswrapper[4580]: I0321 05:59:44.467157 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/reloader/0.log" Mar 21 05:59:44 crc kubenswrapper[4580]: I0321 05:59:44.781297 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-qdhgl_254226c1-a87d-4bca-a3d4-a909452fa9ac/frr-k8s-webhook-server/0.log" Mar 21 05:59:44 crc kubenswrapper[4580]: I0321 05:59:44.970404 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-cf889bd6-5qrxw_22ee4637-a40f-4200-be5a-679e0912f4cf/manager/0.log" Mar 21 05:59:45 crc kubenswrapper[4580]: I0321 05:59:45.136794 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d5b654677-cwqcf_1d9a7806-fa8d-4106-9241-a32afafc5eb7/webhook-server/0.log" Mar 21 05:59:45 crc kubenswrapper[4580]: I0321 05:59:45.413223 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bzwqp_e974ab22-96b8-4617-9fcf-db94114f0b0d/frr/0.log" Mar 21 05:59:45 crc kubenswrapper[4580]: I0321 05:59:45.492052 4580 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-smqrq_d2b75e08-f7c9-47c8-9b08-f574bb92461d/kube-rbac-proxy/0.log" Mar 21 05:59:45 crc kubenswrapper[4580]: I0321 05:59:45.862460 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-smqrq_d2b75e08-f7c9-47c8-9b08-f574bb92461d/speaker/0.log" Mar 21 05:59:51 crc kubenswrapper[4580]: I0321 05:59:51.618221 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 05:59:51 crc kubenswrapper[4580]: E0321 05:59:51.619233 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.153774 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567880-q9vcb"] Mar 21 06:00:00 crc kubenswrapper[4580]: E0321 06:00:00.154745 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af6caf6-a2ab-433e-820d-2cb12c6a5be4" containerName="registry-server" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.154758 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af6caf6-a2ab-433e-820d-2cb12c6a5be4" containerName="registry-server" Mar 21 06:00:00 crc kubenswrapper[4580]: E0321 06:00:00.154817 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af6caf6-a2ab-433e-820d-2cb12c6a5be4" containerName="extract-utilities" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.154826 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af6caf6-a2ab-433e-820d-2cb12c6a5be4" containerName="extract-utilities" Mar 21 
06:00:00 crc kubenswrapper[4580]: E0321 06:00:00.154837 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af6caf6-a2ab-433e-820d-2cb12c6a5be4" containerName="extract-content" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.154843 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af6caf6-a2ab-433e-820d-2cb12c6a5be4" containerName="extract-content" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.155051 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="1af6caf6-a2ab-433e-820d-2cb12c6a5be4" containerName="registry-server" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.155687 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567880-q9vcb" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.162920 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.163977 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.164701 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f"] Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.165407 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.174309 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.181892 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f"] Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.182581 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.182930 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.195630 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567880-q9vcb"] Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.250513 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78q5q\" (UniqueName: \"kubernetes.io/projected/5b24ee55-d60e-4fc2-bb1e-2d42e45b8453-kube-api-access-78q5q\") pod \"auto-csr-approver-29567880-q9vcb\" (UID: \"5b24ee55-d60e-4fc2-bb1e-2d42e45b8453\") " pod="openshift-infra/auto-csr-approver-29567880-q9vcb" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.354197 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cf79a68-1b07-490d-952e-523766e34e82-config-volume\") pod \"collect-profiles-29567880-62d9f\" (UID: \"8cf79a68-1b07-490d-952e-523766e34e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.354396 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78q5q\" (UniqueName: 
\"kubernetes.io/projected/5b24ee55-d60e-4fc2-bb1e-2d42e45b8453-kube-api-access-78q5q\") pod \"auto-csr-approver-29567880-q9vcb\" (UID: \"5b24ee55-d60e-4fc2-bb1e-2d42e45b8453\") " pod="openshift-infra/auto-csr-approver-29567880-q9vcb" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.354435 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cf79a68-1b07-490d-952e-523766e34e82-secret-volume\") pod \"collect-profiles-29567880-62d9f\" (UID: \"8cf79a68-1b07-490d-952e-523766e34e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.354520 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5v5z\" (UniqueName: \"kubernetes.io/projected/8cf79a68-1b07-490d-952e-523766e34e82-kube-api-access-s5v5z\") pod \"collect-profiles-29567880-62d9f\" (UID: \"8cf79a68-1b07-490d-952e-523766e34e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.387944 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78q5q\" (UniqueName: \"kubernetes.io/projected/5b24ee55-d60e-4fc2-bb1e-2d42e45b8453-kube-api-access-78q5q\") pod \"auto-csr-approver-29567880-q9vcb\" (UID: \"5b24ee55-d60e-4fc2-bb1e-2d42e45b8453\") " pod="openshift-infra/auto-csr-approver-29567880-q9vcb" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.456211 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cf79a68-1b07-490d-952e-523766e34e82-secret-volume\") pod \"collect-profiles-29567880-62d9f\" (UID: \"8cf79a68-1b07-490d-952e-523766e34e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 
06:00:00.456322 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5v5z\" (UniqueName: \"kubernetes.io/projected/8cf79a68-1b07-490d-952e-523766e34e82-kube-api-access-s5v5z\") pod \"collect-profiles-29567880-62d9f\" (UID: \"8cf79a68-1b07-490d-952e-523766e34e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.456379 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cf79a68-1b07-490d-952e-523766e34e82-config-volume\") pod \"collect-profiles-29567880-62d9f\" (UID: \"8cf79a68-1b07-490d-952e-523766e34e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.457275 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cf79a68-1b07-490d-952e-523766e34e82-config-volume\") pod \"collect-profiles-29567880-62d9f\" (UID: \"8cf79a68-1b07-490d-952e-523766e34e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.460242 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cf79a68-1b07-490d-952e-523766e34e82-secret-volume\") pod \"collect-profiles-29567880-62d9f\" (UID: \"8cf79a68-1b07-490d-952e-523766e34e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.486595 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5v5z\" (UniqueName: \"kubernetes.io/projected/8cf79a68-1b07-490d-952e-523766e34e82-kube-api-access-s5v5z\") pod \"collect-profiles-29567880-62d9f\" (UID: \"8cf79a68-1b07-490d-952e-523766e34e82\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.502623 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567880-q9vcb" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.521495 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/util/0.log" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.532892 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.868995 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fvnl2"] Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.871980 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.902916 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fvnl2"] Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.934588 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/pull/0.log" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.970219 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kmlz\" (UniqueName: \"kubernetes.io/projected/aa658cfd-9979-4781-ac62-fa46dd78c706-kube-api-access-6kmlz\") pod \"community-operators-fvnl2\" (UID: \"aa658cfd-9979-4781-ac62-fa46dd78c706\") " pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.970371 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa658cfd-9979-4781-ac62-fa46dd78c706-utilities\") pod \"community-operators-fvnl2\" (UID: \"aa658cfd-9979-4781-ac62-fa46dd78c706\") " pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.970498 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa658cfd-9979-4781-ac62-fa46dd78c706-catalog-content\") pod \"community-operators-fvnl2\" (UID: \"aa658cfd-9979-4781-ac62-fa46dd78c706\") " pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:00 crc kubenswrapper[4580]: I0321 06:00:00.987941 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/util/0.log" Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.076880 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa658cfd-9979-4781-ac62-fa46dd78c706-catalog-content\") pod \"community-operators-fvnl2\" (UID: \"aa658cfd-9979-4781-ac62-fa46dd78c706\") " pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.076947 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kmlz\" (UniqueName: \"kubernetes.io/projected/aa658cfd-9979-4781-ac62-fa46dd78c706-kube-api-access-6kmlz\") pod \"community-operators-fvnl2\" (UID: \"aa658cfd-9979-4781-ac62-fa46dd78c706\") " pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.077031 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa658cfd-9979-4781-ac62-fa46dd78c706-utilities\") pod \"community-operators-fvnl2\" (UID: \"aa658cfd-9979-4781-ac62-fa46dd78c706\") " pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.077540 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa658cfd-9979-4781-ac62-fa46dd78c706-catalog-content\") pod \"community-operators-fvnl2\" (UID: \"aa658cfd-9979-4781-ac62-fa46dd78c706\") " pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.078124 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa658cfd-9979-4781-ac62-fa46dd78c706-utilities\") pod 
\"community-operators-fvnl2\" (UID: \"aa658cfd-9979-4781-ac62-fa46dd78c706\") " pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.119060 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kmlz\" (UniqueName: \"kubernetes.io/projected/aa658cfd-9979-4781-ac62-fa46dd78c706-kube-api-access-6kmlz\") pod \"community-operators-fvnl2\" (UID: \"aa658cfd-9979-4781-ac62-fa46dd78c706\") " pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.133110 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/pull/0.log" Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.173381 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f"] Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.202011 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.251521 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567880-q9vcb"] Mar 21 06:00:01 crc kubenswrapper[4580]: W0321 06:00:01.302732 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b24ee55_d60e_4fc2_bb1e_2d42e45b8453.slice/crio-07953a0d4b883fb0db5e8eb32fd5f693d54b05e3a9ace67e3ae34d262f811068 WatchSource:0}: Error finding container 07953a0d4b883fb0db5e8eb32fd5f693d54b05e3a9ace67e3ae34d262f811068: Status 404 returned error can't find the container with id 07953a0d4b883fb0db5e8eb32fd5f693d54b05e3a9ace67e3ae34d262f811068 Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.455644 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/util/0.log" Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.820414 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/pull/0.log" Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.900505 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rg7zb_20e4a1fa-c6ce-4a58-a9b9-a982a19c5243/extract/0.log" Mar 21 06:00:01 crc kubenswrapper[4580]: I0321 06:00:01.924182 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fvnl2"] Mar 21 06:00:02 crc kubenswrapper[4580]: I0321 06:00:02.047754 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567880-q9vcb" 
event={"ID":"5b24ee55-d60e-4fc2-bb1e-2d42e45b8453","Type":"ContainerStarted","Data":"07953a0d4b883fb0db5e8eb32fd5f693d54b05e3a9ace67e3ae34d262f811068"} Mar 21 06:00:02 crc kubenswrapper[4580]: I0321 06:00:02.051417 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvnl2" event={"ID":"aa658cfd-9979-4781-ac62-fa46dd78c706","Type":"ContainerStarted","Data":"56d8e627b1424e92a7c64c3301bc3f5b438418efbf8691abb56b282467d58181"} Mar 21 06:00:02 crc kubenswrapper[4580]: I0321 06:00:02.062435 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" event={"ID":"8cf79a68-1b07-490d-952e-523766e34e82","Type":"ContainerStarted","Data":"816abbe0860ba9dea027d742fdb480e4a770e0ecf89c3d2c60af40ec754fdcca"} Mar 21 06:00:02 crc kubenswrapper[4580]: I0321 06:00:02.062476 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" event={"ID":"8cf79a68-1b07-490d-952e-523766e34e82","Type":"ContainerStarted","Data":"d119f4a306e48d74461dd2c55dbf06e72ef548668f6163ceb2b6c198a1d4537e"} Mar 21 06:00:02 crc kubenswrapper[4580]: I0321 06:00:02.065549 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/util/0.log" Mar 21 06:00:02 crc kubenswrapper[4580]: I0321 06:00:02.083405 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" podStartSLOduration=2.08338517 podStartE2EDuration="2.08338517s" podCreationTimestamp="2026-03-21 06:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 06:00:02.078430307 +0000 UTC m=+4107.161013945" watchObservedRunningTime="2026-03-21 06:00:02.08338517 
+0000 UTC m=+4107.165968798" Mar 21 06:00:02 crc kubenswrapper[4580]: I0321 06:00:02.323898 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/pull/0.log" Mar 21 06:00:02 crc kubenswrapper[4580]: I0321 06:00:02.434767 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/pull/0.log" Mar 21 06:00:02 crc kubenswrapper[4580]: I0321 06:00:02.448896 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/util/0.log" Mar 21 06:00:02 crc kubenswrapper[4580]: I0321 06:00:02.832285 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/util/0.log" Mar 21 06:00:02 crc kubenswrapper[4580]: I0321 06:00:02.839050 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/pull/0.log" Mar 21 06:00:02 crc kubenswrapper[4580]: I0321 06:00:02.916872 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17lcsv_66538d86-0387-4c8b-b266-4ee60f1cbd90/extract/0.log" Mar 21 06:00:03 crc kubenswrapper[4580]: I0321 06:00:03.074739 4580 generic.go:334] "Generic (PLEG): container finished" podID="aa658cfd-9979-4781-ac62-fa46dd78c706" containerID="abc10d268e8d90085325f3bd308d215f806c913a2a2b960b51bf88ed2f32a6a5" exitCode=0 Mar 21 06:00:03 crc kubenswrapper[4580]: I0321 06:00:03.075069 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-fvnl2" event={"ID":"aa658cfd-9979-4781-ac62-fa46dd78c706","Type":"ContainerDied","Data":"abc10d268e8d90085325f3bd308d215f806c913a2a2b960b51bf88ed2f32a6a5"} Mar 21 06:00:03 crc kubenswrapper[4580]: I0321 06:00:03.079168 4580 generic.go:334] "Generic (PLEG): container finished" podID="8cf79a68-1b07-490d-952e-523766e34e82" containerID="816abbe0860ba9dea027d742fdb480e4a770e0ecf89c3d2c60af40ec754fdcca" exitCode=0 Mar 21 06:00:03 crc kubenswrapper[4580]: I0321 06:00:03.079227 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" event={"ID":"8cf79a68-1b07-490d-952e-523766e34e82","Type":"ContainerDied","Data":"816abbe0860ba9dea027d742fdb480e4a770e0ecf89c3d2c60af40ec754fdcca"} Mar 21 06:00:03 crc kubenswrapper[4580]: I0321 06:00:03.085567 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/extract-utilities/0.log" Mar 21 06:00:03 crc kubenswrapper[4580]: I0321 06:00:03.380231 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/extract-content/0.log" Mar 21 06:00:03 crc kubenswrapper[4580]: I0321 06:00:03.392251 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/extract-utilities/0.log" Mar 21 06:00:03 crc kubenswrapper[4580]: I0321 06:00:03.418918 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/extract-content/0.log" Mar 21 06:00:03 crc kubenswrapper[4580]: I0321 06:00:03.643158 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/extract-content/0.log" Mar 21 06:00:03 crc 
kubenswrapper[4580]: I0321 06:00:03.677255 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/extract-utilities/0.log" Mar 21 06:00:04 crc kubenswrapper[4580]: I0321 06:00:04.019161 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/extract-utilities/0.log" Mar 21 06:00:04 crc kubenswrapper[4580]: I0321 06:00:04.160847 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/extract-content/0.log" Mar 21 06:00:04 crc kubenswrapper[4580]: I0321 06:00:04.229260 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ppl4t_371f2b05-72f1-4289-b499-4490d84d0d38/registry-server/0.log" Mar 21 06:00:04 crc kubenswrapper[4580]: I0321 06:00:04.274273 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/extract-utilities/0.log" Mar 21 06:00:04 crc kubenswrapper[4580]: I0321 06:00:04.347299 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/extract-content/0.log" Mar 21 06:00:04 crc kubenswrapper[4580]: I0321 06:00:04.930698 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.069403 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/extract-content/0.log" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.111550 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5v5z\" (UniqueName: \"kubernetes.io/projected/8cf79a68-1b07-490d-952e-523766e34e82-kube-api-access-s5v5z\") pod \"8cf79a68-1b07-490d-952e-523766e34e82\" (UID: \"8cf79a68-1b07-490d-952e-523766e34e82\") " Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.111718 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cf79a68-1b07-490d-952e-523766e34e82-secret-volume\") pod \"8cf79a68-1b07-490d-952e-523766e34e82\" (UID: \"8cf79a68-1b07-490d-952e-523766e34e82\") " Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.111794 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cf79a68-1b07-490d-952e-523766e34e82-config-volume\") pod \"8cf79a68-1b07-490d-952e-523766e34e82\" (UID: \"8cf79a68-1b07-490d-952e-523766e34e82\") " Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.113568 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cf79a68-1b07-490d-952e-523766e34e82-config-volume" (OuterVolumeSpecName: "config-volume") pod "8cf79a68-1b07-490d-952e-523766e34e82" (UID: "8cf79a68-1b07-490d-952e-523766e34e82"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.115921 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" event={"ID":"8cf79a68-1b07-490d-952e-523766e34e82","Type":"ContainerDied","Data":"d119f4a306e48d74461dd2c55dbf06e72ef548668f6163ceb2b6c198a1d4537e"} Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.115962 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d119f4a306e48d74461dd2c55dbf06e72ef548668f6163ceb2b6c198a1d4537e" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.115993 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-62d9f" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.122291 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cf79a68-1b07-490d-952e-523766e34e82-kube-api-access-s5v5z" (OuterVolumeSpecName: "kube-api-access-s5v5z") pod "8cf79a68-1b07-490d-952e-523766e34e82" (UID: "8cf79a68-1b07-490d-952e-523766e34e82"). InnerVolumeSpecName "kube-api-access-s5v5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.134087 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf79a68-1b07-490d-952e-523766e34e82-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8cf79a68-1b07-490d-952e-523766e34e82" (UID: "8cf79a68-1b07-490d-952e-523766e34e82"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.214045 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5v5z\" (UniqueName: \"kubernetes.io/projected/8cf79a68-1b07-490d-952e-523766e34e82-kube-api-access-s5v5z\") on node \"crc\" DevicePath \"\"" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.214074 4580 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cf79a68-1b07-490d-952e-523766e34e82-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.214084 4580 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cf79a68-1b07-490d-952e-523766e34e82-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.220460 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/extract-utilities/0.log" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.362326 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-p95jn_6ad5649a-1bef-41a6-aeaa-73f2850df16a/marketplace-operator/0.log" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.627589 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 06:00:05 crc kubenswrapper[4580]: E0321 06:00:05.627897 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" 
podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.740755 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/extract-utilities/0.log" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.746923 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gbbtm_fa2c2b42-95d7-47dd-b5c3-47ef8689c50c/registry-server/0.log" Mar 21 06:00:05 crc kubenswrapper[4580]: I0321 06:00:05.988471 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/extract-content/0.log" Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.022570 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/extract-utilities/0.log" Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.042629 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7"] Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.057125 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-88td7"] Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.108266 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/extract-content/0.log" Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.127286 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567880-q9vcb" event={"ID":"5b24ee55-d60e-4fc2-bb1e-2d42e45b8453","Type":"ContainerStarted","Data":"c499ab666c033a7376d6883491e99934b783d5bb344d9ca38e3935762c28405e"} Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.130863 
4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvnl2" event={"ID":"aa658cfd-9979-4781-ac62-fa46dd78c706","Type":"ContainerStarted","Data":"3bc36c57f2a909127dd0edd509465bc5e4ba1e750d14a53b3526ea9398e9cf19"} Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.181237 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567880-q9vcb" podStartSLOduration=3.336254581 podStartE2EDuration="6.181217086s" podCreationTimestamp="2026-03-21 06:00:00 +0000 UTC" firstStartedPulling="2026-03-21 06:00:01.315092841 +0000 UTC m=+4106.397676469" lastFinishedPulling="2026-03-21 06:00:04.160055346 +0000 UTC m=+4109.242638974" observedRunningTime="2026-03-21 06:00:06.143519199 +0000 UTC m=+4111.226102847" watchObservedRunningTime="2026-03-21 06:00:06.181217086 +0000 UTC m=+4111.263800724" Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.370896 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/extract-content/0.log" Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.441687 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/extract-utilities/0.log" Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.519090 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ggsnm_2307470f-41d9-48a7-bfce-d96d8a13c568/registry-server/0.log" Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.671274 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/extract-utilities/0.log" Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.889185 4580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/extract-content/0.log" Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.942024 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/extract-utilities/0.log" Mar 21 06:00:06 crc kubenswrapper[4580]: I0321 06:00:06.953050 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/extract-content/0.log" Mar 21 06:00:07 crc kubenswrapper[4580]: I0321 06:00:07.116169 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/extract-utilities/0.log" Mar 21 06:00:07 crc kubenswrapper[4580]: I0321 06:00:07.143994 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/extract-content/0.log" Mar 21 06:00:07 crc kubenswrapper[4580]: I0321 06:00:07.144091 4580 generic.go:334] "Generic (PLEG): container finished" podID="5b24ee55-d60e-4fc2-bb1e-2d42e45b8453" containerID="c499ab666c033a7376d6883491e99934b783d5bb344d9ca38e3935762c28405e" exitCode=0 Mar 21 06:00:07 crc kubenswrapper[4580]: I0321 06:00:07.144145 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567880-q9vcb" event={"ID":"5b24ee55-d60e-4fc2-bb1e-2d42e45b8453","Type":"ContainerDied","Data":"c499ab666c033a7376d6883491e99934b783d5bb344d9ca38e3935762c28405e"} Mar 21 06:00:07 crc kubenswrapper[4580]: I0321 06:00:07.147309 4580 generic.go:334] "Generic (PLEG): container finished" podID="aa658cfd-9979-4781-ac62-fa46dd78c706" containerID="3bc36c57f2a909127dd0edd509465bc5e4ba1e750d14a53b3526ea9398e9cf19" exitCode=0 Mar 21 06:00:07 crc kubenswrapper[4580]: I0321 06:00:07.147354 4580 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-fvnl2" event={"ID":"aa658cfd-9979-4781-ac62-fa46dd78c706","Type":"ContainerDied","Data":"3bc36c57f2a909127dd0edd509465bc5e4ba1e750d14a53b3526ea9398e9cf19"} Mar 21 06:00:07 crc kubenswrapper[4580]: I0321 06:00:07.630926 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa0c7395-a399-416c-aa03-231a27008b0d" path="/var/lib/kubelet/pods/fa0c7395-a399-416c-aa03-231a27008b0d/volumes" Mar 21 06:00:07 crc kubenswrapper[4580]: I0321 06:00:07.802877 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmvk5_48b0cf7d-100c-4953-88d7-c4775e45c45d/registry-server/0.log" Mar 21 06:00:08 crc kubenswrapper[4580]: I0321 06:00:08.161273 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvnl2" event={"ID":"aa658cfd-9979-4781-ac62-fa46dd78c706","Type":"ContainerStarted","Data":"980afae38324b3245c4542692b528e0a05cab0ba3085e373de75bdb737c42cb8"} Mar 21 06:00:08 crc kubenswrapper[4580]: I0321 06:00:08.191660 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fvnl2" podStartSLOduration=3.588267743 podStartE2EDuration="8.191634176s" podCreationTimestamp="2026-03-21 06:00:00 +0000 UTC" firstStartedPulling="2026-03-21 06:00:03.077507666 +0000 UTC m=+4108.160091294" lastFinishedPulling="2026-03-21 06:00:07.680874099 +0000 UTC m=+4112.763457727" observedRunningTime="2026-03-21 06:00:08.18213534 +0000 UTC m=+4113.264718978" watchObservedRunningTime="2026-03-21 06:00:08.191634176 +0000 UTC m=+4113.274217814" Mar 21 06:00:08 crc kubenswrapper[4580]: I0321 06:00:08.545399 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567880-q9vcb" Mar 21 06:00:08 crc kubenswrapper[4580]: I0321 06:00:08.689098 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78q5q\" (UniqueName: \"kubernetes.io/projected/5b24ee55-d60e-4fc2-bb1e-2d42e45b8453-kube-api-access-78q5q\") pod \"5b24ee55-d60e-4fc2-bb1e-2d42e45b8453\" (UID: \"5b24ee55-d60e-4fc2-bb1e-2d42e45b8453\") " Mar 21 06:00:08 crc kubenswrapper[4580]: I0321 06:00:08.697990 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b24ee55-d60e-4fc2-bb1e-2d42e45b8453-kube-api-access-78q5q" (OuterVolumeSpecName: "kube-api-access-78q5q") pod "5b24ee55-d60e-4fc2-bb1e-2d42e45b8453" (UID: "5b24ee55-d60e-4fc2-bb1e-2d42e45b8453"). InnerVolumeSpecName "kube-api-access-78q5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:00:08 crc kubenswrapper[4580]: I0321 06:00:08.739742 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567874-nk5ld"] Mar 21 06:00:08 crc kubenswrapper[4580]: I0321 06:00:08.747723 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567874-nk5ld"] Mar 21 06:00:08 crc kubenswrapper[4580]: I0321 06:00:08.791369 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78q5q\" (UniqueName: \"kubernetes.io/projected/5b24ee55-d60e-4fc2-bb1e-2d42e45b8453-kube-api-access-78q5q\") on node \"crc\" DevicePath \"\"" Mar 21 06:00:09 crc kubenswrapper[4580]: I0321 06:00:09.174375 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567880-q9vcb" Mar 21 06:00:09 crc kubenswrapper[4580]: I0321 06:00:09.175875 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567880-q9vcb" event={"ID":"5b24ee55-d60e-4fc2-bb1e-2d42e45b8453","Type":"ContainerDied","Data":"07953a0d4b883fb0db5e8eb32fd5f693d54b05e3a9ace67e3ae34d262f811068"} Mar 21 06:00:09 crc kubenswrapper[4580]: I0321 06:00:09.175916 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07953a0d4b883fb0db5e8eb32fd5f693d54b05e3a9ace67e3ae34d262f811068" Mar 21 06:00:09 crc kubenswrapper[4580]: I0321 06:00:09.630124 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be3bba12-a082-489f-b9b2-d582337a8979" path="/var/lib/kubelet/pods/be3bba12-a082-489f-b9b2-d582337a8979/volumes" Mar 21 06:00:11 crc kubenswrapper[4580]: I0321 06:00:11.207627 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:11 crc kubenswrapper[4580]: I0321 06:00:11.207905 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:11 crc kubenswrapper[4580]: I0321 06:00:11.261574 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:12 crc kubenswrapper[4580]: I0321 06:00:12.267328 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:12 crc kubenswrapper[4580]: I0321 06:00:12.330402 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fvnl2"] Mar 21 06:00:14 crc kubenswrapper[4580]: I0321 06:00:14.230864 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fvnl2" 
podUID="aa658cfd-9979-4781-ac62-fa46dd78c706" containerName="registry-server" containerID="cri-o://980afae38324b3245c4542692b528e0a05cab0ba3085e373de75bdb737c42cb8" gracePeriod=2 Mar 21 06:00:14 crc kubenswrapper[4580]: I0321 06:00:14.766418 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:14 crc kubenswrapper[4580]: I0321 06:00:14.906831 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kmlz\" (UniqueName: \"kubernetes.io/projected/aa658cfd-9979-4781-ac62-fa46dd78c706-kube-api-access-6kmlz\") pod \"aa658cfd-9979-4781-ac62-fa46dd78c706\" (UID: \"aa658cfd-9979-4781-ac62-fa46dd78c706\") " Mar 21 06:00:14 crc kubenswrapper[4580]: I0321 06:00:14.907032 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa658cfd-9979-4781-ac62-fa46dd78c706-catalog-content\") pod \"aa658cfd-9979-4781-ac62-fa46dd78c706\" (UID: \"aa658cfd-9979-4781-ac62-fa46dd78c706\") " Mar 21 06:00:14 crc kubenswrapper[4580]: I0321 06:00:14.907119 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa658cfd-9979-4781-ac62-fa46dd78c706-utilities\") pod \"aa658cfd-9979-4781-ac62-fa46dd78c706\" (UID: \"aa658cfd-9979-4781-ac62-fa46dd78c706\") " Mar 21 06:00:14 crc kubenswrapper[4580]: I0321 06:00:14.908216 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa658cfd-9979-4781-ac62-fa46dd78c706-utilities" (OuterVolumeSpecName: "utilities") pod "aa658cfd-9979-4781-ac62-fa46dd78c706" (UID: "aa658cfd-9979-4781-ac62-fa46dd78c706"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 06:00:14 crc kubenswrapper[4580]: I0321 06:00:14.922023 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa658cfd-9979-4781-ac62-fa46dd78c706-kube-api-access-6kmlz" (OuterVolumeSpecName: "kube-api-access-6kmlz") pod "aa658cfd-9979-4781-ac62-fa46dd78c706" (UID: "aa658cfd-9979-4781-ac62-fa46dd78c706"). InnerVolumeSpecName "kube-api-access-6kmlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:00:14 crc kubenswrapper[4580]: I0321 06:00:14.990663 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa658cfd-9979-4781-ac62-fa46dd78c706-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa658cfd-9979-4781-ac62-fa46dd78c706" (UID: "aa658cfd-9979-4781-ac62-fa46dd78c706"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.017106 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kmlz\" (UniqueName: \"kubernetes.io/projected/aa658cfd-9979-4781-ac62-fa46dd78c706-kube-api-access-6kmlz\") on node \"crc\" DevicePath \"\"" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.017150 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa658cfd-9979-4781-ac62-fa46dd78c706-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.017160 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa658cfd-9979-4781-ac62-fa46dd78c706-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.243705 4580 generic.go:334] "Generic (PLEG): container finished" podID="aa658cfd-9979-4781-ac62-fa46dd78c706" 
containerID="980afae38324b3245c4542692b528e0a05cab0ba3085e373de75bdb737c42cb8" exitCode=0 Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.243763 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvnl2" event={"ID":"aa658cfd-9979-4781-ac62-fa46dd78c706","Type":"ContainerDied","Data":"980afae38324b3245c4542692b528e0a05cab0ba3085e373de75bdb737c42cb8"} Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.243823 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvnl2" event={"ID":"aa658cfd-9979-4781-ac62-fa46dd78c706","Type":"ContainerDied","Data":"56d8e627b1424e92a7c64c3301bc3f5b438418efbf8691abb56b282467d58181"} Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.243857 4580 scope.go:117] "RemoveContainer" containerID="980afae38324b3245c4542692b528e0a05cab0ba3085e373de75bdb737c42cb8" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.244018 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fvnl2" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.279833 4580 scope.go:117] "RemoveContainer" containerID="3bc36c57f2a909127dd0edd509465bc5e4ba1e750d14a53b3526ea9398e9cf19" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.311480 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fvnl2"] Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.324832 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fvnl2"] Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.629665 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa658cfd-9979-4781-ac62-fa46dd78c706" path="/var/lib/kubelet/pods/aa658cfd-9979-4781-ac62-fa46dd78c706/volumes" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.672036 4580 scope.go:117] "RemoveContainer" containerID="abc10d268e8d90085325f3bd308d215f806c913a2a2b960b51bf88ed2f32a6a5" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.777062 4580 scope.go:117] "RemoveContainer" containerID="980afae38324b3245c4542692b528e0a05cab0ba3085e373de75bdb737c42cb8" Mar 21 06:00:15 crc kubenswrapper[4580]: E0321 06:00:15.778228 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"980afae38324b3245c4542692b528e0a05cab0ba3085e373de75bdb737c42cb8\": container with ID starting with 980afae38324b3245c4542692b528e0a05cab0ba3085e373de75bdb737c42cb8 not found: ID does not exist" containerID="980afae38324b3245c4542692b528e0a05cab0ba3085e373de75bdb737c42cb8" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.778263 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"980afae38324b3245c4542692b528e0a05cab0ba3085e373de75bdb737c42cb8"} err="failed to get container status 
\"980afae38324b3245c4542692b528e0a05cab0ba3085e373de75bdb737c42cb8\": rpc error: code = NotFound desc = could not find container \"980afae38324b3245c4542692b528e0a05cab0ba3085e373de75bdb737c42cb8\": container with ID starting with 980afae38324b3245c4542692b528e0a05cab0ba3085e373de75bdb737c42cb8 not found: ID does not exist" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.778285 4580 scope.go:117] "RemoveContainer" containerID="3bc36c57f2a909127dd0edd509465bc5e4ba1e750d14a53b3526ea9398e9cf19" Mar 21 06:00:15 crc kubenswrapper[4580]: E0321 06:00:15.778577 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bc36c57f2a909127dd0edd509465bc5e4ba1e750d14a53b3526ea9398e9cf19\": container with ID starting with 3bc36c57f2a909127dd0edd509465bc5e4ba1e750d14a53b3526ea9398e9cf19 not found: ID does not exist" containerID="3bc36c57f2a909127dd0edd509465bc5e4ba1e750d14a53b3526ea9398e9cf19" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.778608 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bc36c57f2a909127dd0edd509465bc5e4ba1e750d14a53b3526ea9398e9cf19"} err="failed to get container status \"3bc36c57f2a909127dd0edd509465bc5e4ba1e750d14a53b3526ea9398e9cf19\": rpc error: code = NotFound desc = could not find container \"3bc36c57f2a909127dd0edd509465bc5e4ba1e750d14a53b3526ea9398e9cf19\": container with ID starting with 3bc36c57f2a909127dd0edd509465bc5e4ba1e750d14a53b3526ea9398e9cf19 not found: ID does not exist" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.778624 4580 scope.go:117] "RemoveContainer" containerID="abc10d268e8d90085325f3bd308d215f806c913a2a2b960b51bf88ed2f32a6a5" Mar 21 06:00:15 crc kubenswrapper[4580]: E0321 06:00:15.779015 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"abc10d268e8d90085325f3bd308d215f806c913a2a2b960b51bf88ed2f32a6a5\": container with ID starting with abc10d268e8d90085325f3bd308d215f806c913a2a2b960b51bf88ed2f32a6a5 not found: ID does not exist" containerID="abc10d268e8d90085325f3bd308d215f806c913a2a2b960b51bf88ed2f32a6a5" Mar 21 06:00:15 crc kubenswrapper[4580]: I0321 06:00:15.779038 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc10d268e8d90085325f3bd308d215f806c913a2a2b960b51bf88ed2f32a6a5"} err="failed to get container status \"abc10d268e8d90085325f3bd308d215f806c913a2a2b960b51bf88ed2f32a6a5\": rpc error: code = NotFound desc = could not find container \"abc10d268e8d90085325f3bd308d215f806c913a2a2b960b51bf88ed2f32a6a5\": container with ID starting with abc10d268e8d90085325f3bd308d215f806c913a2a2b960b51bf88ed2f32a6a5 not found: ID does not exist" Mar 21 06:00:16 crc kubenswrapper[4580]: I0321 06:00:16.151829 4580 scope.go:117] "RemoveContainer" containerID="87faedf35ac6f3b2468e9c18e4131308b29fbe42705baf3e14f1863e821fcd21" Mar 21 06:00:16 crc kubenswrapper[4580]: I0321 06:00:16.181470 4580 scope.go:117] "RemoveContainer" containerID="b339f9a642b036511c49aa5f4066e01413343ff3a677569945aac32589a46d19" Mar 21 06:00:18 crc kubenswrapper[4580]: I0321 06:00:18.618103 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 06:00:18 crc kubenswrapper[4580]: E0321 06:00:18.618833 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 06:00:32 crc kubenswrapper[4580]: I0321 06:00:32.618357 4580 scope.go:117] "RemoveContainer" 
containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 06:00:32 crc kubenswrapper[4580]: E0321 06:00:32.619205 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 06:00:34 crc kubenswrapper[4580]: E0321 06:00:34.325467 4580 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.138:47336->38.102.83.138:40205: write tcp 38.102.83.138:47336->38.102.83.138:40205: write: broken pipe Mar 21 06:00:46 crc kubenswrapper[4580]: I0321 06:00:46.618569 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 06:00:46 crc kubenswrapper[4580]: E0321 06:00:46.619497 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 06:00:57 crc kubenswrapper[4580]: I0321 06:00:57.622365 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 06:00:57 crc kubenswrapper[4580]: E0321 06:00:57.623123 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.168509 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29567881-dmh67"] Mar 21 06:01:00 crc kubenswrapper[4580]: E0321 06:01:00.169191 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b24ee55-d60e-4fc2-bb1e-2d42e45b8453" containerName="oc" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.169206 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b24ee55-d60e-4fc2-bb1e-2d42e45b8453" containerName="oc" Mar 21 06:01:00 crc kubenswrapper[4580]: E0321 06:01:00.169230 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cf79a68-1b07-490d-952e-523766e34e82" containerName="collect-profiles" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.169239 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf79a68-1b07-490d-952e-523766e34e82" containerName="collect-profiles" Mar 21 06:01:00 crc kubenswrapper[4580]: E0321 06:01:00.169261 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa658cfd-9979-4781-ac62-fa46dd78c706" containerName="registry-server" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.169269 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa658cfd-9979-4781-ac62-fa46dd78c706" containerName="registry-server" Mar 21 06:01:00 crc kubenswrapper[4580]: E0321 06:01:00.169298 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa658cfd-9979-4781-ac62-fa46dd78c706" containerName="extract-utilities" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.169305 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa658cfd-9979-4781-ac62-fa46dd78c706" containerName="extract-utilities" Mar 21 06:01:00 crc kubenswrapper[4580]: E0321 
06:01:00.169318 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa658cfd-9979-4781-ac62-fa46dd78c706" containerName="extract-content" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.169326 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa658cfd-9979-4781-ac62-fa46dd78c706" containerName="extract-content" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.169533 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cf79a68-1b07-490d-952e-523766e34e82" containerName="collect-profiles" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.169549 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b24ee55-d60e-4fc2-bb1e-2d42e45b8453" containerName="oc" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.169568 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa658cfd-9979-4781-ac62-fa46dd78c706" containerName="registry-server" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.170313 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.201846 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567881-dmh67"] Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.319124 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-config-data\") pod \"keystone-cron-29567881-dmh67\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.319170 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79vhd\" (UniqueName: \"kubernetes.io/projected/f35d84d6-558d-4dca-8f4a-0dc3c595b738-kube-api-access-79vhd\") pod \"keystone-cron-29567881-dmh67\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.319284 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-combined-ca-bundle\") pod \"keystone-cron-29567881-dmh67\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.319300 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-fernet-keys\") pod \"keystone-cron-29567881-dmh67\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.421519 4580 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-combined-ca-bundle\") pod \"keystone-cron-29567881-dmh67\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.421568 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-fernet-keys\") pod \"keystone-cron-29567881-dmh67\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.421641 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-config-data\") pod \"keystone-cron-29567881-dmh67\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.421663 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79vhd\" (UniqueName: \"kubernetes.io/projected/f35d84d6-558d-4dca-8f4a-0dc3c595b738-kube-api-access-79vhd\") pod \"keystone-cron-29567881-dmh67\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.428532 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-fernet-keys\") pod \"keystone-cron-29567881-dmh67\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.449497 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-combined-ca-bundle\") pod \"keystone-cron-29567881-dmh67\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.452586 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-config-data\") pod \"keystone-cron-29567881-dmh67\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.458352 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79vhd\" (UniqueName: \"kubernetes.io/projected/f35d84d6-558d-4dca-8f4a-0dc3c595b738-kube-api-access-79vhd\") pod \"keystone-cron-29567881-dmh67\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:00 crc kubenswrapper[4580]: I0321 06:01:00.500713 4580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:01 crc kubenswrapper[4580]: I0321 06:01:01.118763 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567881-dmh67"] Mar 21 06:01:01 crc kubenswrapper[4580]: I0321 06:01:01.717084 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567881-dmh67" event={"ID":"f35d84d6-558d-4dca-8f4a-0dc3c595b738","Type":"ContainerStarted","Data":"b896dcdac2986991702353b5e3f2e21f654a9508c64fcae11cdd9ffff93919df"} Mar 21 06:01:01 crc kubenswrapper[4580]: I0321 06:01:01.717480 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567881-dmh67" event={"ID":"f35d84d6-558d-4dca-8f4a-0dc3c595b738","Type":"ContainerStarted","Data":"6e641a80f9c5a98404026d124f36d7f5b0729e192dd0fc8538ee0ab9bbde5649"} Mar 21 06:01:01 crc kubenswrapper[4580]: I0321 06:01:01.736542 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29567881-dmh67" podStartSLOduration=1.7365171080000001 podStartE2EDuration="1.736517108s" podCreationTimestamp="2026-03-21 06:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 06:01:01.735406448 +0000 UTC m=+4166.817990136" watchObservedRunningTime="2026-03-21 06:01:01.736517108 +0000 UTC m=+4166.819100756" Mar 21 06:01:04 crc kubenswrapper[4580]: I0321 06:01:04.742167 4580 generic.go:334] "Generic (PLEG): container finished" podID="f35d84d6-558d-4dca-8f4a-0dc3c595b738" containerID="b896dcdac2986991702353b5e3f2e21f654a9508c64fcae11cdd9ffff93919df" exitCode=0 Mar 21 06:01:04 crc kubenswrapper[4580]: I0321 06:01:04.742370 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567881-dmh67" 
event={"ID":"f35d84d6-558d-4dca-8f4a-0dc3c595b738","Type":"ContainerDied","Data":"b896dcdac2986991702353b5e3f2e21f654a9508c64fcae11cdd9ffff93919df"} Mar 21 06:01:04 crc kubenswrapper[4580]: I0321 06:01:04.812019 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rcndd"] Mar 21 06:01:04 crc kubenswrapper[4580]: I0321 06:01:04.815286 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:04 crc kubenswrapper[4580]: I0321 06:01:04.824130 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rcndd"] Mar 21 06:01:04 crc kubenswrapper[4580]: I0321 06:01:04.913089 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrnjh\" (UniqueName: \"kubernetes.io/projected/78516a4f-299a-4ea7-a204-fec240217bd9-kube-api-access-wrnjh\") pod \"certified-operators-rcndd\" (UID: \"78516a4f-299a-4ea7-a204-fec240217bd9\") " pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:04 crc kubenswrapper[4580]: I0321 06:01:04.913143 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78516a4f-299a-4ea7-a204-fec240217bd9-catalog-content\") pod \"certified-operators-rcndd\" (UID: \"78516a4f-299a-4ea7-a204-fec240217bd9\") " pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:04 crc kubenswrapper[4580]: I0321 06:01:04.913229 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78516a4f-299a-4ea7-a204-fec240217bd9-utilities\") pod \"certified-operators-rcndd\" (UID: \"78516a4f-299a-4ea7-a204-fec240217bd9\") " pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:05 crc kubenswrapper[4580]: I0321 06:01:05.015626 
4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78516a4f-299a-4ea7-a204-fec240217bd9-utilities\") pod \"certified-operators-rcndd\" (UID: \"78516a4f-299a-4ea7-a204-fec240217bd9\") " pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:05 crc kubenswrapper[4580]: I0321 06:01:05.016097 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrnjh\" (UniqueName: \"kubernetes.io/projected/78516a4f-299a-4ea7-a204-fec240217bd9-kube-api-access-wrnjh\") pod \"certified-operators-rcndd\" (UID: \"78516a4f-299a-4ea7-a204-fec240217bd9\") " pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:05 crc kubenswrapper[4580]: I0321 06:01:05.016150 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78516a4f-299a-4ea7-a204-fec240217bd9-utilities\") pod \"certified-operators-rcndd\" (UID: \"78516a4f-299a-4ea7-a204-fec240217bd9\") " pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:05 crc kubenswrapper[4580]: I0321 06:01:05.016227 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78516a4f-299a-4ea7-a204-fec240217bd9-catalog-content\") pod \"certified-operators-rcndd\" (UID: \"78516a4f-299a-4ea7-a204-fec240217bd9\") " pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:05 crc kubenswrapper[4580]: I0321 06:01:05.016525 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78516a4f-299a-4ea7-a204-fec240217bd9-catalog-content\") pod \"certified-operators-rcndd\" (UID: \"78516a4f-299a-4ea7-a204-fec240217bd9\") " pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:05 crc kubenswrapper[4580]: I0321 06:01:05.050134 4580 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wrnjh\" (UniqueName: \"kubernetes.io/projected/78516a4f-299a-4ea7-a204-fec240217bd9-kube-api-access-wrnjh\") pod \"certified-operators-rcndd\" (UID: \"78516a4f-299a-4ea7-a204-fec240217bd9\") " pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:05 crc kubenswrapper[4580]: I0321 06:01:05.149485 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:05 crc kubenswrapper[4580]: I0321 06:01:05.680649 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rcndd"] Mar 21 06:01:05 crc kubenswrapper[4580]: I0321 06:01:05.759997 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcndd" event={"ID":"78516a4f-299a-4ea7-a204-fec240217bd9","Type":"ContainerStarted","Data":"51b6c429e571101b4b6af39ee096955c653cf44cf8fe01677d295eb873632a4b"} Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.194472 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.239032 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-config-data\") pod \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.239192 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79vhd\" (UniqueName: \"kubernetes.io/projected/f35d84d6-558d-4dca-8f4a-0dc3c595b738-kube-api-access-79vhd\") pod \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.239248 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-fernet-keys\") pod \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.239281 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-combined-ca-bundle\") pod \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\" (UID: \"f35d84d6-558d-4dca-8f4a-0dc3c595b738\") " Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.250944 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f35d84d6-558d-4dca-8f4a-0dc3c595b738-kube-api-access-79vhd" (OuterVolumeSpecName: "kube-api-access-79vhd") pod "f35d84d6-558d-4dca-8f4a-0dc3c595b738" (UID: "f35d84d6-558d-4dca-8f4a-0dc3c595b738"). InnerVolumeSpecName "kube-api-access-79vhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.252676 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f35d84d6-558d-4dca-8f4a-0dc3c595b738" (UID: "f35d84d6-558d-4dca-8f4a-0dc3c595b738"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.336018 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f35d84d6-558d-4dca-8f4a-0dc3c595b738" (UID: "f35d84d6-558d-4dca-8f4a-0dc3c595b738"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.341580 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79vhd\" (UniqueName: \"kubernetes.io/projected/f35d84d6-558d-4dca-8f4a-0dc3c595b738-kube-api-access-79vhd\") on node \"crc\" DevicePath \"\"" Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.341610 4580 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.341619 4580 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.406881 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-config-data" (OuterVolumeSpecName: "config-data") pod 
"f35d84d6-558d-4dca-8f4a-0dc3c595b738" (UID: "f35d84d6-558d-4dca-8f4a-0dc3c595b738"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.443944 4580 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f35d84d6-558d-4dca-8f4a-0dc3c595b738-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.770064 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567881-dmh67" event={"ID":"f35d84d6-558d-4dca-8f4a-0dc3c595b738","Type":"ContainerDied","Data":"6e641a80f9c5a98404026d124f36d7f5b0729e192dd0fc8538ee0ab9bbde5649"} Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.770108 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e641a80f9c5a98404026d124f36d7f5b0729e192dd0fc8538ee0ab9bbde5649" Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.770161 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567881-dmh67" Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.785296 4580 generic.go:334] "Generic (PLEG): container finished" podID="78516a4f-299a-4ea7-a204-fec240217bd9" containerID="8ad3f814388b7836bcacd709151076c013b6a6e64c59a708e3176357848258d5" exitCode=0 Mar 21 06:01:06 crc kubenswrapper[4580]: I0321 06:01:06.785343 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcndd" event={"ID":"78516a4f-299a-4ea7-a204-fec240217bd9","Type":"ContainerDied","Data":"8ad3f814388b7836bcacd709151076c013b6a6e64c59a708e3176357848258d5"} Mar 21 06:01:07 crc kubenswrapper[4580]: I0321 06:01:07.799274 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcndd" event={"ID":"78516a4f-299a-4ea7-a204-fec240217bd9","Type":"ContainerStarted","Data":"c4424ca28160c9c52b5f468a3dc4c22ec432e48d22bb9a22368381e4aebca85d"} Mar 21 06:01:09 crc kubenswrapper[4580]: I0321 06:01:09.620265 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 06:01:09 crc kubenswrapper[4580]: E0321 06:01:09.620885 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 06:01:09 crc kubenswrapper[4580]: I0321 06:01:09.824774 4580 generic.go:334] "Generic (PLEG): container finished" podID="78516a4f-299a-4ea7-a204-fec240217bd9" containerID="c4424ca28160c9c52b5f468a3dc4c22ec432e48d22bb9a22368381e4aebca85d" exitCode=0 Mar 21 06:01:09 crc kubenswrapper[4580]: I0321 06:01:09.825903 4580 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcndd" event={"ID":"78516a4f-299a-4ea7-a204-fec240217bd9","Type":"ContainerDied","Data":"c4424ca28160c9c52b5f468a3dc4c22ec432e48d22bb9a22368381e4aebca85d"} Mar 21 06:01:10 crc kubenswrapper[4580]: I0321 06:01:10.836058 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcndd" event={"ID":"78516a4f-299a-4ea7-a204-fec240217bd9","Type":"ContainerStarted","Data":"c4ce5bcc3a8c701053d1626a94101f286a93c43e3a1ffb5217ccaee86c63423e"} Mar 21 06:01:10 crc kubenswrapper[4580]: I0321 06:01:10.870861 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rcndd" podStartSLOduration=3.111193852 podStartE2EDuration="6.870834502s" podCreationTimestamp="2026-03-21 06:01:04 +0000 UTC" firstStartedPulling="2026-03-21 06:01:06.787662051 +0000 UTC m=+4171.870245679" lastFinishedPulling="2026-03-21 06:01:10.547302701 +0000 UTC m=+4175.629886329" observedRunningTime="2026-03-21 06:01:10.862263471 +0000 UTC m=+4175.944847109" watchObservedRunningTime="2026-03-21 06:01:10.870834502 +0000 UTC m=+4175.953418130" Mar 21 06:01:15 crc kubenswrapper[4580]: I0321 06:01:15.149734 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:15 crc kubenswrapper[4580]: I0321 06:01:15.150492 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:15 crc kubenswrapper[4580]: I0321 06:01:15.203250 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:16 crc kubenswrapper[4580]: I0321 06:01:16.401450 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:16 crc kubenswrapper[4580]: I0321 
06:01:16.450263 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rcndd"] Mar 21 06:01:17 crc kubenswrapper[4580]: I0321 06:01:17.892299 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rcndd" podUID="78516a4f-299a-4ea7-a204-fec240217bd9" containerName="registry-server" containerID="cri-o://c4ce5bcc3a8c701053d1626a94101f286a93c43e3a1ffb5217ccaee86c63423e" gracePeriod=2 Mar 21 06:01:18 crc kubenswrapper[4580]: I0321 06:01:18.904033 4580 generic.go:334] "Generic (PLEG): container finished" podID="78516a4f-299a-4ea7-a204-fec240217bd9" containerID="c4ce5bcc3a8c701053d1626a94101f286a93c43e3a1ffb5217ccaee86c63423e" exitCode=0 Mar 21 06:01:18 crc kubenswrapper[4580]: I0321 06:01:18.904095 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcndd" event={"ID":"78516a4f-299a-4ea7-a204-fec240217bd9","Type":"ContainerDied","Data":"c4ce5bcc3a8c701053d1626a94101f286a93c43e3a1ffb5217ccaee86c63423e"} Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.023943 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rcndd" Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.224926 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78516a4f-299a-4ea7-a204-fec240217bd9-utilities\") pod \"78516a4f-299a-4ea7-a204-fec240217bd9\" (UID: \"78516a4f-299a-4ea7-a204-fec240217bd9\") " Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.224978 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrnjh\" (UniqueName: \"kubernetes.io/projected/78516a4f-299a-4ea7-a204-fec240217bd9-kube-api-access-wrnjh\") pod \"78516a4f-299a-4ea7-a204-fec240217bd9\" (UID: \"78516a4f-299a-4ea7-a204-fec240217bd9\") " Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.225070 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78516a4f-299a-4ea7-a204-fec240217bd9-catalog-content\") pod \"78516a4f-299a-4ea7-a204-fec240217bd9\" (UID: \"78516a4f-299a-4ea7-a204-fec240217bd9\") " Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.225922 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78516a4f-299a-4ea7-a204-fec240217bd9-utilities" (OuterVolumeSpecName: "utilities") pod "78516a4f-299a-4ea7-a204-fec240217bd9" (UID: "78516a4f-299a-4ea7-a204-fec240217bd9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.231656 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78516a4f-299a-4ea7-a204-fec240217bd9-kube-api-access-wrnjh" (OuterVolumeSpecName: "kube-api-access-wrnjh") pod "78516a4f-299a-4ea7-a204-fec240217bd9" (UID: "78516a4f-299a-4ea7-a204-fec240217bd9"). InnerVolumeSpecName "kube-api-access-wrnjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.285281 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78516a4f-299a-4ea7-a204-fec240217bd9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78516a4f-299a-4ea7-a204-fec240217bd9" (UID: "78516a4f-299a-4ea7-a204-fec240217bd9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.327045 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78516a4f-299a-4ea7-a204-fec240217bd9-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.327084 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrnjh\" (UniqueName: \"kubernetes.io/projected/78516a4f-299a-4ea7-a204-fec240217bd9-kube-api-access-wrnjh\") on node \"crc\" DevicePath \"\"" Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.327094 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78516a4f-299a-4ea7-a204-fec240217bd9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.913526 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcndd" event={"ID":"78516a4f-299a-4ea7-a204-fec240217bd9","Type":"ContainerDied","Data":"51b6c429e571101b4b6af39ee096955c653cf44cf8fe01677d295eb873632a4b"} Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.913579 4580 scope.go:117] "RemoveContainer" containerID="c4ce5bcc3a8c701053d1626a94101f286a93c43e3a1ffb5217ccaee86c63423e" Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.913613 4580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rcndd"
Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.942407 4580 scope.go:117] "RemoveContainer" containerID="c4424ca28160c9c52b5f468a3dc4c22ec432e48d22bb9a22368381e4aebca85d"
Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.949712 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rcndd"]
Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.958506 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rcndd"]
Mar 21 06:01:19 crc kubenswrapper[4580]: I0321 06:01:19.973706 4580 scope.go:117] "RemoveContainer" containerID="8ad3f814388b7836bcacd709151076c013b6a6e64c59a708e3176357848258d5"
Mar 21 06:01:21 crc kubenswrapper[4580]: I0321 06:01:21.618423 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12"
Mar 21 06:01:21 crc kubenswrapper[4580]: E0321 06:01:21.618976 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664"
Mar 21 06:01:21 crc kubenswrapper[4580]: I0321 06:01:21.630571 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78516a4f-299a-4ea7-a204-fec240217bd9" path="/var/lib/kubelet/pods/78516a4f-299a-4ea7-a204-fec240217bd9/volumes"
Mar 21 06:01:35 crc kubenswrapper[4580]: I0321 06:01:35.646538 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12"
Mar 21 06:01:35 crc kubenswrapper[4580]: E0321 06:01:35.648440 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.040766 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mrdvx"]
Mar 21 06:01:45 crc kubenswrapper[4580]: E0321 06:01:45.041968 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f35d84d6-558d-4dca-8f4a-0dc3c595b738" containerName="keystone-cron"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.041985 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f35d84d6-558d-4dca-8f4a-0dc3c595b738" containerName="keystone-cron"
Mar 21 06:01:45 crc kubenswrapper[4580]: E0321 06:01:45.042014 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78516a4f-299a-4ea7-a204-fec240217bd9" containerName="extract-content"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.042022 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="78516a4f-299a-4ea7-a204-fec240217bd9" containerName="extract-content"
Mar 21 06:01:45 crc kubenswrapper[4580]: E0321 06:01:45.042033 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78516a4f-299a-4ea7-a204-fec240217bd9" containerName="registry-server"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.042041 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="78516a4f-299a-4ea7-a204-fec240217bd9" containerName="registry-server"
Mar 21 06:01:45 crc kubenswrapper[4580]: E0321 06:01:45.042056 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78516a4f-299a-4ea7-a204-fec240217bd9" containerName="extract-utilities"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.042063 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="78516a4f-299a-4ea7-a204-fec240217bd9" containerName="extract-utilities"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.042294 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="78516a4f-299a-4ea7-a204-fec240217bd9" containerName="registry-server"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.042321 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="f35d84d6-558d-4dca-8f4a-0dc3c595b738" containerName="keystone-cron"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.049136 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.060227 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrdvx"]
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.179374 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a74b4ec-79b2-4983-9cd9-73e0436be439-catalog-content\") pod \"redhat-marketplace-mrdvx\" (UID: \"3a74b4ec-79b2-4983-9cd9-73e0436be439\") " pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.179554 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a74b4ec-79b2-4983-9cd9-73e0436be439-utilities\") pod \"redhat-marketplace-mrdvx\" (UID: \"3a74b4ec-79b2-4983-9cd9-73e0436be439\") " pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.179705 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55x6w\" (UniqueName: \"kubernetes.io/projected/3a74b4ec-79b2-4983-9cd9-73e0436be439-kube-api-access-55x6w\") pod \"redhat-marketplace-mrdvx\" (UID: \"3a74b4ec-79b2-4983-9cd9-73e0436be439\") " pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.281230 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a74b4ec-79b2-4983-9cd9-73e0436be439-utilities\") pod \"redhat-marketplace-mrdvx\" (UID: \"3a74b4ec-79b2-4983-9cd9-73e0436be439\") " pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.281385 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55x6w\" (UniqueName: \"kubernetes.io/projected/3a74b4ec-79b2-4983-9cd9-73e0436be439-kube-api-access-55x6w\") pod \"redhat-marketplace-mrdvx\" (UID: \"3a74b4ec-79b2-4983-9cd9-73e0436be439\") " pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.281834 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a74b4ec-79b2-4983-9cd9-73e0436be439-catalog-content\") pod \"redhat-marketplace-mrdvx\" (UID: \"3a74b4ec-79b2-4983-9cd9-73e0436be439\") " pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.281874 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a74b4ec-79b2-4983-9cd9-73e0436be439-utilities\") pod \"redhat-marketplace-mrdvx\" (UID: \"3a74b4ec-79b2-4983-9cd9-73e0436be439\") " pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.282230 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a74b4ec-79b2-4983-9cd9-73e0436be439-catalog-content\") pod \"redhat-marketplace-mrdvx\" (UID: \"3a74b4ec-79b2-4983-9cd9-73e0436be439\") " pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.306123 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55x6w\" (UniqueName: \"kubernetes.io/projected/3a74b4ec-79b2-4983-9cd9-73e0436be439-kube-api-access-55x6w\") pod \"redhat-marketplace-mrdvx\" (UID: \"3a74b4ec-79b2-4983-9cd9-73e0436be439\") " pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.379727 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:45 crc kubenswrapper[4580]: I0321 06:01:45.856956 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrdvx"]
Mar 21 06:01:46 crc kubenswrapper[4580]: I0321 06:01:46.176393 4580 generic.go:334] "Generic (PLEG): container finished" podID="3a74b4ec-79b2-4983-9cd9-73e0436be439" containerID="cda53be9c2749fec50e22c2820824a9f3408b160eb9e4d64e920d1ba8b635e41" exitCode=0
Mar 21 06:01:46 crc kubenswrapper[4580]: I0321 06:01:46.177584 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrdvx" event={"ID":"3a74b4ec-79b2-4983-9cd9-73e0436be439","Type":"ContainerDied","Data":"cda53be9c2749fec50e22c2820824a9f3408b160eb9e4d64e920d1ba8b635e41"}
Mar 21 06:01:46 crc kubenswrapper[4580]: I0321 06:01:46.177672 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrdvx" event={"ID":"3a74b4ec-79b2-4983-9cd9-73e0436be439","Type":"ContainerStarted","Data":"ae40de0c410cb500bccff3b2102557defc937c28fb72120df76f654f79aa4426"}
Mar 21 06:01:47 crc kubenswrapper[4580]: I0321 06:01:47.618005 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12"
Mar 21 06:01:47 crc kubenswrapper[4580]: E0321 06:01:47.618851 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664"
Mar 21 06:01:48 crc kubenswrapper[4580]: I0321 06:01:48.198513 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrdvx" event={"ID":"3a74b4ec-79b2-4983-9cd9-73e0436be439","Type":"ContainerStarted","Data":"1459a7bb200bb286c6365fff2a84e9cc26dbf1293dc480cf8e0dc9afa834cc4c"}
Mar 21 06:01:48 crc kubenswrapper[4580]: E0321 06:01:48.933310 4580 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a74b4ec_79b2_4983_9cd9_73e0436be439.slice/crio-1459a7bb200bb286c6365fff2a84e9cc26dbf1293dc480cf8e0dc9afa834cc4c.scope\": RecentStats: unable to find data in memory cache]"
Mar 21 06:01:49 crc kubenswrapper[4580]: I0321 06:01:49.212750 4580 generic.go:334] "Generic (PLEG): container finished" podID="3a74b4ec-79b2-4983-9cd9-73e0436be439" containerID="1459a7bb200bb286c6365fff2a84e9cc26dbf1293dc480cf8e0dc9afa834cc4c" exitCode=0
Mar 21 06:01:49 crc kubenswrapper[4580]: I0321 06:01:49.212820 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrdvx" event={"ID":"3a74b4ec-79b2-4983-9cd9-73e0436be439","Type":"ContainerDied","Data":"1459a7bb200bb286c6365fff2a84e9cc26dbf1293dc480cf8e0dc9afa834cc4c"}
Mar 21 06:01:50 crc kubenswrapper[4580]: I0321 06:01:50.224298 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrdvx" event={"ID":"3a74b4ec-79b2-4983-9cd9-73e0436be439","Type":"ContainerStarted","Data":"b9dfb2abe1410f5bc0d160db5a6c08a701ef0aa33c3a85cbbcbb5c06381e856c"}
Mar 21 06:01:50 crc kubenswrapper[4580]: I0321 06:01:50.255284 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mrdvx" podStartSLOduration=1.696552506 podStartE2EDuration="5.255264391s" podCreationTimestamp="2026-03-21 06:01:45 +0000 UTC" firstStartedPulling="2026-03-21 06:01:46.178244526 +0000 UTC m=+4211.260828154" lastFinishedPulling="2026-03-21 06:01:49.736956411 +0000 UTC m=+4214.819540039" observedRunningTime="2026-03-21 06:01:50.245659232 +0000 UTC m=+4215.328242860" watchObservedRunningTime="2026-03-21 06:01:50.255264391 +0000 UTC m=+4215.337848019"
Mar 21 06:01:55 crc kubenswrapper[4580]: I0321 06:01:55.381083 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:55 crc kubenswrapper[4580]: I0321 06:01:55.381703 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:55 crc kubenswrapper[4580]: I0321 06:01:55.445401 4580 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:56 crc kubenswrapper[4580]: I0321 06:01:56.592728 4580 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:56 crc kubenswrapper[4580]: I0321 06:01:56.641726 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrdvx"]
Mar 21 06:01:58 crc kubenswrapper[4580]: I0321 06:01:58.290646 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mrdvx" podUID="3a74b4ec-79b2-4983-9cd9-73e0436be439" containerName="registry-server" containerID="cri-o://b9dfb2abe1410f5bc0d160db5a6c08a701ef0aa33c3a85cbbcbb5c06381e856c" gracePeriod=2
Mar 21 06:01:58 crc kubenswrapper[4580]: I0321 06:01:58.820910 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:58 crc kubenswrapper[4580]: I0321 06:01:58.978910 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a74b4ec-79b2-4983-9cd9-73e0436be439-utilities\") pod \"3a74b4ec-79b2-4983-9cd9-73e0436be439\" (UID: \"3a74b4ec-79b2-4983-9cd9-73e0436be439\") "
Mar 21 06:01:58 crc kubenswrapper[4580]: I0321 06:01:58.979211 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a74b4ec-79b2-4983-9cd9-73e0436be439-catalog-content\") pod \"3a74b4ec-79b2-4983-9cd9-73e0436be439\" (UID: \"3a74b4ec-79b2-4983-9cd9-73e0436be439\") "
Mar 21 06:01:58 crc kubenswrapper[4580]: I0321 06:01:58.979255 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55x6w\" (UniqueName: \"kubernetes.io/projected/3a74b4ec-79b2-4983-9cd9-73e0436be439-kube-api-access-55x6w\") pod \"3a74b4ec-79b2-4983-9cd9-73e0436be439\" (UID: \"3a74b4ec-79b2-4983-9cd9-73e0436be439\") "
Mar 21 06:01:58 crc kubenswrapper[4580]: I0321 06:01:58.979948 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a74b4ec-79b2-4983-9cd9-73e0436be439-utilities" (OuterVolumeSpecName: "utilities") pod "3a74b4ec-79b2-4983-9cd9-73e0436be439" (UID: "3a74b4ec-79b2-4983-9cd9-73e0436be439"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 06:01:58 crc kubenswrapper[4580]: I0321 06:01:58.986091 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a74b4ec-79b2-4983-9cd9-73e0436be439-kube-api-access-55x6w" (OuterVolumeSpecName: "kube-api-access-55x6w") pod "3a74b4ec-79b2-4983-9cd9-73e0436be439" (UID: "3a74b4ec-79b2-4983-9cd9-73e0436be439"). InnerVolumeSpecName "kube-api-access-55x6w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.006986 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a74b4ec-79b2-4983-9cd9-73e0436be439-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a74b4ec-79b2-4983-9cd9-73e0436be439" (UID: "3a74b4ec-79b2-4983-9cd9-73e0436be439"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.083085 4580 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a74b4ec-79b2-4983-9cd9-73e0436be439-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.086504 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55x6w\" (UniqueName: \"kubernetes.io/projected/3a74b4ec-79b2-4983-9cd9-73e0436be439-kube-api-access-55x6w\") on node \"crc\" DevicePath \"\""
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.086656 4580 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a74b4ec-79b2-4983-9cd9-73e0436be439-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.301624 4580 generic.go:334] "Generic (PLEG): container finished" podID="3a74b4ec-79b2-4983-9cd9-73e0436be439" containerID="b9dfb2abe1410f5bc0d160db5a6c08a701ef0aa33c3a85cbbcbb5c06381e856c" exitCode=0
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.301676 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrdvx" event={"ID":"3a74b4ec-79b2-4983-9cd9-73e0436be439","Type":"ContainerDied","Data":"b9dfb2abe1410f5bc0d160db5a6c08a701ef0aa33c3a85cbbcbb5c06381e856c"}
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.301695 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mrdvx"
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.301713 4580 scope.go:117] "RemoveContainer" containerID="b9dfb2abe1410f5bc0d160db5a6c08a701ef0aa33c3a85cbbcbb5c06381e856c"
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.301702 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrdvx" event={"ID":"3a74b4ec-79b2-4983-9cd9-73e0436be439","Type":"ContainerDied","Data":"ae40de0c410cb500bccff3b2102557defc937c28fb72120df76f654f79aa4426"}
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.334222 4580 scope.go:117] "RemoveContainer" containerID="1459a7bb200bb286c6365fff2a84e9cc26dbf1293dc480cf8e0dc9afa834cc4c"
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.347255 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrdvx"]
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.355047 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrdvx"]
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.364587 4580 scope.go:117] "RemoveContainer" containerID="cda53be9c2749fec50e22c2820824a9f3408b160eb9e4d64e920d1ba8b635e41"
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.395353 4580 scope.go:117] "RemoveContainer" containerID="b9dfb2abe1410f5bc0d160db5a6c08a701ef0aa33c3a85cbbcbb5c06381e856c"
Mar 21 06:01:59 crc kubenswrapper[4580]: E0321 06:01:59.396126 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9dfb2abe1410f5bc0d160db5a6c08a701ef0aa33c3a85cbbcbb5c06381e856c\": container with ID starting with b9dfb2abe1410f5bc0d160db5a6c08a701ef0aa33c3a85cbbcbb5c06381e856c not found: ID does not exist" containerID="b9dfb2abe1410f5bc0d160db5a6c08a701ef0aa33c3a85cbbcbb5c06381e856c"
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.396166 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9dfb2abe1410f5bc0d160db5a6c08a701ef0aa33c3a85cbbcbb5c06381e856c"} err="failed to get container status \"b9dfb2abe1410f5bc0d160db5a6c08a701ef0aa33c3a85cbbcbb5c06381e856c\": rpc error: code = NotFound desc = could not find container \"b9dfb2abe1410f5bc0d160db5a6c08a701ef0aa33c3a85cbbcbb5c06381e856c\": container with ID starting with b9dfb2abe1410f5bc0d160db5a6c08a701ef0aa33c3a85cbbcbb5c06381e856c not found: ID does not exist"
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.396230 4580 scope.go:117] "RemoveContainer" containerID="1459a7bb200bb286c6365fff2a84e9cc26dbf1293dc480cf8e0dc9afa834cc4c"
Mar 21 06:01:59 crc kubenswrapper[4580]: E0321 06:01:59.396552 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1459a7bb200bb286c6365fff2a84e9cc26dbf1293dc480cf8e0dc9afa834cc4c\": container with ID starting with 1459a7bb200bb286c6365fff2a84e9cc26dbf1293dc480cf8e0dc9afa834cc4c not found: ID does not exist" containerID="1459a7bb200bb286c6365fff2a84e9cc26dbf1293dc480cf8e0dc9afa834cc4c"
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.396577 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1459a7bb200bb286c6365fff2a84e9cc26dbf1293dc480cf8e0dc9afa834cc4c"} err="failed to get container status \"1459a7bb200bb286c6365fff2a84e9cc26dbf1293dc480cf8e0dc9afa834cc4c\": rpc error: code = NotFound desc = could not find container \"1459a7bb200bb286c6365fff2a84e9cc26dbf1293dc480cf8e0dc9afa834cc4c\": container with ID starting with 1459a7bb200bb286c6365fff2a84e9cc26dbf1293dc480cf8e0dc9afa834cc4c not found: ID does not exist"
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.396605 4580 scope.go:117] "RemoveContainer" containerID="cda53be9c2749fec50e22c2820824a9f3408b160eb9e4d64e920d1ba8b635e41"
Mar 21 06:01:59 crc kubenswrapper[4580]: E0321 06:01:59.396842 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda53be9c2749fec50e22c2820824a9f3408b160eb9e4d64e920d1ba8b635e41\": container with ID starting with cda53be9c2749fec50e22c2820824a9f3408b160eb9e4d64e920d1ba8b635e41 not found: ID does not exist" containerID="cda53be9c2749fec50e22c2820824a9f3408b160eb9e4d64e920d1ba8b635e41"
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.396868 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda53be9c2749fec50e22c2820824a9f3408b160eb9e4d64e920d1ba8b635e41"} err="failed to get container status \"cda53be9c2749fec50e22c2820824a9f3408b160eb9e4d64e920d1ba8b635e41\": rpc error: code = NotFound desc = could not find container \"cda53be9c2749fec50e22c2820824a9f3408b160eb9e4d64e920d1ba8b635e41\": container with ID starting with cda53be9c2749fec50e22c2820824a9f3408b160eb9e4d64e920d1ba8b635e41 not found: ID does not exist"
Mar 21 06:01:59 crc kubenswrapper[4580]: I0321 06:01:59.637328 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a74b4ec-79b2-4983-9cd9-73e0436be439" path="/var/lib/kubelet/pods/3a74b4ec-79b2-4983-9cd9-73e0436be439/volumes"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.145294 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567882-5fdkh"]
Mar 21 06:02:00 crc kubenswrapper[4580]: E0321 06:02:00.145854 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a74b4ec-79b2-4983-9cd9-73e0436be439" containerName="extract-content"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.145881 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a74b4ec-79b2-4983-9cd9-73e0436be439" containerName="extract-content"
Mar 21 06:02:00 crc kubenswrapper[4580]: E0321 06:02:00.145892 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a74b4ec-79b2-4983-9cd9-73e0436be439" containerName="extract-utilities"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.145901 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a74b4ec-79b2-4983-9cd9-73e0436be439" containerName="extract-utilities"
Mar 21 06:02:00 crc kubenswrapper[4580]: E0321 06:02:00.145917 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a74b4ec-79b2-4983-9cd9-73e0436be439" containerName="registry-server"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.145924 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a74b4ec-79b2-4983-9cd9-73e0436be439" containerName="registry-server"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.146151 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a74b4ec-79b2-4983-9cd9-73e0436be439" containerName="registry-server"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.146948 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567882-5fdkh"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.150649 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.152036 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.157555 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.158825 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567882-5fdkh"]
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.315286 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8nzk\" (UniqueName: \"kubernetes.io/projected/224b0bbf-a167-49bb-93f5-335da8081c2b-kube-api-access-p8nzk\") pod \"auto-csr-approver-29567882-5fdkh\" (UID: \"224b0bbf-a167-49bb-93f5-335da8081c2b\") " pod="openshift-infra/auto-csr-approver-29567882-5fdkh"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.416756 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8nzk\" (UniqueName: \"kubernetes.io/projected/224b0bbf-a167-49bb-93f5-335da8081c2b-kube-api-access-p8nzk\") pod \"auto-csr-approver-29567882-5fdkh\" (UID: \"224b0bbf-a167-49bb-93f5-335da8081c2b\") " pod="openshift-infra/auto-csr-approver-29567882-5fdkh"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.442553 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8nzk\" (UniqueName: \"kubernetes.io/projected/224b0bbf-a167-49bb-93f5-335da8081c2b-kube-api-access-p8nzk\") pod \"auto-csr-approver-29567882-5fdkh\" (UID: \"224b0bbf-a167-49bb-93f5-335da8081c2b\") " pod="openshift-infra/auto-csr-approver-29567882-5fdkh"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.465171 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567882-5fdkh"
Mar 21 06:02:00 crc kubenswrapper[4580]: I0321 06:02:00.963268 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567882-5fdkh"]
Mar 21 06:02:01 crc kubenswrapper[4580]: I0321 06:02:01.321773 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567882-5fdkh" event={"ID":"224b0bbf-a167-49bb-93f5-335da8081c2b","Type":"ContainerStarted","Data":"f4543b9582be20362b02457d4e4eb26c4389c9c6bb9f5d6880e194bd98b8bb23"}
Mar 21 06:02:01 crc kubenswrapper[4580]: I0321 06:02:01.618019 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12"
Mar 21 06:02:01 crc kubenswrapper[4580]: E0321 06:02:01.618385 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664"
Mar 21 06:02:02 crc kubenswrapper[4580]: I0321 06:02:02.331571 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567882-5fdkh" event={"ID":"224b0bbf-a167-49bb-93f5-335da8081c2b","Type":"ContainerStarted","Data":"a85b6dea80831f608b010b8ef076cb6dc8f24eb61bb41fd51563994bd802aed8"}
Mar 21 06:02:02 crc kubenswrapper[4580]: I0321 06:02:02.350093 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567882-5fdkh" podStartSLOduration=1.262445117 podStartE2EDuration="2.350069674s" podCreationTimestamp="2026-03-21 06:02:00 +0000 UTC" firstStartedPulling="2026-03-21 06:02:00.960662982 +0000 UTC m=+4226.043246620" lastFinishedPulling="2026-03-21 06:02:02.048287549 +0000 UTC m=+4227.130871177" observedRunningTime="2026-03-21 06:02:02.344703869 +0000 UTC m=+4227.427287507" watchObservedRunningTime="2026-03-21 06:02:02.350069674 +0000 UTC m=+4227.432653312"
Mar 21 06:02:03 crc kubenswrapper[4580]: I0321 06:02:03.343189 4580 generic.go:334] "Generic (PLEG): container finished" podID="224b0bbf-a167-49bb-93f5-335da8081c2b" containerID="a85b6dea80831f608b010b8ef076cb6dc8f24eb61bb41fd51563994bd802aed8" exitCode=0
Mar 21 06:02:03 crc kubenswrapper[4580]: I0321 06:02:03.343510 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567882-5fdkh" event={"ID":"224b0bbf-a167-49bb-93f5-335da8081c2b","Type":"ContainerDied","Data":"a85b6dea80831f608b010b8ef076cb6dc8f24eb61bb41fd51563994bd802aed8"}
Mar 21 06:02:04 crc kubenswrapper[4580]: I0321 06:02:04.737875 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567882-5fdkh"
Mar 21 06:02:04 crc kubenswrapper[4580]: I0321 06:02:04.906008 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8nzk\" (UniqueName: \"kubernetes.io/projected/224b0bbf-a167-49bb-93f5-335da8081c2b-kube-api-access-p8nzk\") pod \"224b0bbf-a167-49bb-93f5-335da8081c2b\" (UID: \"224b0bbf-a167-49bb-93f5-335da8081c2b\") "
Mar 21 06:02:04 crc kubenswrapper[4580]: I0321 06:02:04.921025 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/224b0bbf-a167-49bb-93f5-335da8081c2b-kube-api-access-p8nzk" (OuterVolumeSpecName: "kube-api-access-p8nzk") pod "224b0bbf-a167-49bb-93f5-335da8081c2b" (UID: "224b0bbf-a167-49bb-93f5-335da8081c2b"). InnerVolumeSpecName "kube-api-access-p8nzk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 06:02:05 crc kubenswrapper[4580]: I0321 06:02:05.008667 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8nzk\" (UniqueName: \"kubernetes.io/projected/224b0bbf-a167-49bb-93f5-335da8081c2b-kube-api-access-p8nzk\") on node \"crc\" DevicePath \"\""
Mar 21 06:02:05 crc kubenswrapper[4580]: I0321 06:02:05.380245 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567882-5fdkh" event={"ID":"224b0bbf-a167-49bb-93f5-335da8081c2b","Type":"ContainerDied","Data":"f4543b9582be20362b02457d4e4eb26c4389c9c6bb9f5d6880e194bd98b8bb23"}
Mar 21 06:02:05 crc kubenswrapper[4580]: I0321 06:02:05.380614 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4543b9582be20362b02457d4e4eb26c4389c9c6bb9f5d6880e194bd98b8bb23"
Mar 21 06:02:05 crc kubenswrapper[4580]: I0321 06:02:05.380683 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567882-5fdkh"
Mar 21 06:02:05 crc kubenswrapper[4580]: I0321 06:02:05.431332 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567876-r6brf"]
Mar 21 06:02:05 crc kubenswrapper[4580]: I0321 06:02:05.440100 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567876-r6brf"]
Mar 21 06:02:05 crc kubenswrapper[4580]: I0321 06:02:05.626573 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd6acf86-a552-4342-a5e8-fb3fc12bed27" path="/var/lib/kubelet/pods/fd6acf86-a552-4342-a5e8-fb3fc12bed27/volumes"
Mar 21 06:02:16 crc kubenswrapper[4580]: I0321 06:02:16.350649 4580 scope.go:117] "RemoveContainer" containerID="e27e778dbdf80afe1a92082948be63986d33051cff610253c485b5dbc3b1555c"
Mar 21 06:02:16 crc kubenswrapper[4580]: I0321 06:02:16.567325 4580 scope.go:117] "RemoveContainer" containerID="25a9dc95778e9d3616efe72351a48efd85d2d781c1eed6b5c4b62cdcb9a67d24"
Mar 21 06:02:16 crc kubenswrapper[4580]: I0321 06:02:16.617775 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12"
Mar 21 06:02:16 crc kubenswrapper[4580]: E0321 06:02:16.618075 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664"
Mar 21 06:02:18 crc kubenswrapper[4580]: I0321 06:02:18.685820 4580 generic.go:334] "Generic (PLEG): container finished" podID="2d986e27-d896-4e73-9a01-c99895700d10" containerID="155f69cf6352fe4ebb7de6e187c8475a9c62a77716a57bdfec71929e2f429654" exitCode=0
Mar 21 06:02:18 crc kubenswrapper[4580]: I0321 06:02:18.685945 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b4tks/must-gather-6c4ns" event={"ID":"2d986e27-d896-4e73-9a01-c99895700d10","Type":"ContainerDied","Data":"155f69cf6352fe4ebb7de6e187c8475a9c62a77716a57bdfec71929e2f429654"}
Mar 21 06:02:18 crc kubenswrapper[4580]: I0321 06:02:18.686869 4580 scope.go:117] "RemoveContainer" containerID="155f69cf6352fe4ebb7de6e187c8475a9c62a77716a57bdfec71929e2f429654"
Mar 21 06:02:19 crc kubenswrapper[4580]: I0321 06:02:19.507904 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b4tks_must-gather-6c4ns_2d986e27-d896-4e73-9a01-c99895700d10/gather/0.log"
Mar 21 06:02:28 crc kubenswrapper[4580]: I0321 06:02:28.618554 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12"
Mar 21 06:02:28 crc kubenswrapper[4580]: E0321 06:02:28.619329 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664"
Mar 21 06:02:30 crc kubenswrapper[4580]: I0321 06:02:30.885527 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b4tks/must-gather-6c4ns"]
Mar 21 06:02:30 crc kubenswrapper[4580]: I0321 06:02:30.886137 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-b4tks/must-gather-6c4ns" podUID="2d986e27-d896-4e73-9a01-c99895700d10" containerName="copy" containerID="cri-o://5313dff47ccf04da59b5ab712b584e95bbebbb240510d875928ce13240c4e1f4" gracePeriod=2
Mar 21 06:02:30 crc kubenswrapper[4580]: I0321 06:02:30.893222 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b4tks/must-gather-6c4ns"]
Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.351341 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b4tks_must-gather-6c4ns_2d986e27-d896-4e73-9a01-c99895700d10/copy/0.log"
Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.352022 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b4tks/must-gather-6c4ns"
Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.443643 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk2bk\" (UniqueName: \"kubernetes.io/projected/2d986e27-d896-4e73-9a01-c99895700d10-kube-api-access-rk2bk\") pod \"2d986e27-d896-4e73-9a01-c99895700d10\" (UID: \"2d986e27-d896-4e73-9a01-c99895700d10\") "
Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.443772 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2d986e27-d896-4e73-9a01-c99895700d10-must-gather-output\") pod \"2d986e27-d896-4e73-9a01-c99895700d10\" (UID: \"2d986e27-d896-4e73-9a01-c99895700d10\") "
Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.459126 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d986e27-d896-4e73-9a01-c99895700d10-kube-api-access-rk2bk" (OuterVolumeSpecName: "kube-api-access-rk2bk") pod "2d986e27-d896-4e73-9a01-c99895700d10" (UID: "2d986e27-d896-4e73-9a01-c99895700d10"). InnerVolumeSpecName "kube-api-access-rk2bk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.546228 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk2bk\" (UniqueName: \"kubernetes.io/projected/2d986e27-d896-4e73-9a01-c99895700d10-kube-api-access-rk2bk\") on node \"crc\" DevicePath \"\""
Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.604253 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d986e27-d896-4e73-9a01-c99895700d10-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2d986e27-d896-4e73-9a01-c99895700d10" (UID: "2d986e27-d896-4e73-9a01-c99895700d10"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.630832 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d986e27-d896-4e73-9a01-c99895700d10" path="/var/lib/kubelet/pods/2d986e27-d896-4e73-9a01-c99895700d10/volumes"
Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.648306 4580 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2d986e27-d896-4e73-9a01-c99895700d10-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.825480 4580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b4tks_must-gather-6c4ns_2d986e27-d896-4e73-9a01-c99895700d10/copy/0.log"
Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.825766 4580 generic.go:334] "Generic (PLEG): container finished" podID="2d986e27-d896-4e73-9a01-c99895700d10" containerID="5313dff47ccf04da59b5ab712b584e95bbebbb240510d875928ce13240c4e1f4" exitCode=143
Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.825832 4580 scope.go:117] "RemoveContainer" containerID="5313dff47ccf04da59b5ab712b584e95bbebbb240510d875928ce13240c4e1f4"
Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.825938 4580 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-b4tks/must-gather-6c4ns" Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.844571 4580 scope.go:117] "RemoveContainer" containerID="155f69cf6352fe4ebb7de6e187c8475a9c62a77716a57bdfec71929e2f429654" Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.886281 4580 scope.go:117] "RemoveContainer" containerID="5313dff47ccf04da59b5ab712b584e95bbebbb240510d875928ce13240c4e1f4" Mar 21 06:02:31 crc kubenswrapper[4580]: E0321 06:02:31.887472 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5313dff47ccf04da59b5ab712b584e95bbebbb240510d875928ce13240c4e1f4\": container with ID starting with 5313dff47ccf04da59b5ab712b584e95bbebbb240510d875928ce13240c4e1f4 not found: ID does not exist" containerID="5313dff47ccf04da59b5ab712b584e95bbebbb240510d875928ce13240c4e1f4" Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.887515 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5313dff47ccf04da59b5ab712b584e95bbebbb240510d875928ce13240c4e1f4"} err="failed to get container status \"5313dff47ccf04da59b5ab712b584e95bbebbb240510d875928ce13240c4e1f4\": rpc error: code = NotFound desc = could not find container \"5313dff47ccf04da59b5ab712b584e95bbebbb240510d875928ce13240c4e1f4\": container with ID starting with 5313dff47ccf04da59b5ab712b584e95bbebbb240510d875928ce13240c4e1f4 not found: ID does not exist" Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.887540 4580 scope.go:117] "RemoveContainer" containerID="155f69cf6352fe4ebb7de6e187c8475a9c62a77716a57bdfec71929e2f429654" Mar 21 06:02:31 crc kubenswrapper[4580]: E0321 06:02:31.887955 4580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"155f69cf6352fe4ebb7de6e187c8475a9c62a77716a57bdfec71929e2f429654\": container with ID starting with 
155f69cf6352fe4ebb7de6e187c8475a9c62a77716a57bdfec71929e2f429654 not found: ID does not exist" containerID="155f69cf6352fe4ebb7de6e187c8475a9c62a77716a57bdfec71929e2f429654" Mar 21 06:02:31 crc kubenswrapper[4580]: I0321 06:02:31.887987 4580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"155f69cf6352fe4ebb7de6e187c8475a9c62a77716a57bdfec71929e2f429654"} err="failed to get container status \"155f69cf6352fe4ebb7de6e187c8475a9c62a77716a57bdfec71929e2f429654\": rpc error: code = NotFound desc = could not find container \"155f69cf6352fe4ebb7de6e187c8475a9c62a77716a57bdfec71929e2f429654\": container with ID starting with 155f69cf6352fe4ebb7de6e187c8475a9c62a77716a57bdfec71929e2f429654 not found: ID does not exist" Mar 21 06:02:43 crc kubenswrapper[4580]: I0321 06:02:43.618615 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 06:02:43 crc kubenswrapper[4580]: E0321 06:02:43.619756 4580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7w8lj_openshift-machine-config-operator(a9668dcb-27e6-469d-aa01-da4dc9cf6664)\"" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" Mar 21 06:02:56 crc kubenswrapper[4580]: I0321 06:02:56.094910 4580 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6dbb667f95-c5g4x" podUID="21065819-f94d-4cc9-925f-c4be4eeee0d7" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 21 06:02:58 crc kubenswrapper[4580]: I0321 06:02:58.617746 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 06:02:59 crc kubenswrapper[4580]: I0321 06:02:59.091582 4580 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"379e8e024a57dd10d5bdedc9a7778f8224ee5c20fbf1b614b603068912694be3"} Mar 21 06:03:16 crc kubenswrapper[4580]: I0321 06:03:16.714602 4580 scope.go:117] "RemoveContainer" containerID="a8377e11eea0ede5eb647c2e39b3cec914436b6b85fff3cf3d01a44d28da1e68" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.150298 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567884-jmmwc"] Mar 21 06:04:00 crc kubenswrapper[4580]: E0321 06:04:00.151597 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d986e27-d896-4e73-9a01-c99895700d10" containerName="gather" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.151624 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d986e27-d896-4e73-9a01-c99895700d10" containerName="gather" Mar 21 06:04:00 crc kubenswrapper[4580]: E0321 06:04:00.151646 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d986e27-d896-4e73-9a01-c99895700d10" containerName="copy" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.151657 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d986e27-d896-4e73-9a01-c99895700d10" containerName="copy" Mar 21 06:04:00 crc kubenswrapper[4580]: E0321 06:04:00.151692 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="224b0bbf-a167-49bb-93f5-335da8081c2b" containerName="oc" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.151706 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="224b0bbf-a167-49bb-93f5-335da8081c2b" containerName="oc" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.151969 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d986e27-d896-4e73-9a01-c99895700d10" containerName="gather" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 
06:04:00.151986 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d986e27-d896-4e73-9a01-c99895700d10" containerName="copy" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.152009 4580 memory_manager.go:354] "RemoveStaleState removing state" podUID="224b0bbf-a167-49bb-93f5-335da8081c2b" containerName="oc" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.153884 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567884-jmmwc" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.156942 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.157730 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.163483 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.167766 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567884-jmmwc"] Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.234467 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv77x\" (UniqueName: \"kubernetes.io/projected/50d11c0b-b5e7-4ddc-9617-f167e14aa812-kube-api-access-zv77x\") pod \"auto-csr-approver-29567884-jmmwc\" (UID: \"50d11c0b-b5e7-4ddc-9617-f167e14aa812\") " pod="openshift-infra/auto-csr-approver-29567884-jmmwc" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.337128 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv77x\" (UniqueName: \"kubernetes.io/projected/50d11c0b-b5e7-4ddc-9617-f167e14aa812-kube-api-access-zv77x\") pod \"auto-csr-approver-29567884-jmmwc\" (UID: 
\"50d11c0b-b5e7-4ddc-9617-f167e14aa812\") " pod="openshift-infra/auto-csr-approver-29567884-jmmwc" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.381140 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv77x\" (UniqueName: \"kubernetes.io/projected/50d11c0b-b5e7-4ddc-9617-f167e14aa812-kube-api-access-zv77x\") pod \"auto-csr-approver-29567884-jmmwc\" (UID: \"50d11c0b-b5e7-4ddc-9617-f167e14aa812\") " pod="openshift-infra/auto-csr-approver-29567884-jmmwc" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.490208 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567884-jmmwc" Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.932601 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567884-jmmwc"] Mar 21 06:04:00 crc kubenswrapper[4580]: I0321 06:04:00.943216 4580 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 06:04:01 crc kubenswrapper[4580]: I0321 06:04:01.686007 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567884-jmmwc" event={"ID":"50d11c0b-b5e7-4ddc-9617-f167e14aa812","Type":"ContainerStarted","Data":"0612ec78d9b77487c3f119d7b13d5fb65744494ed8c43e9cc9bc440a8bab1c6d"} Mar 21 06:04:02 crc kubenswrapper[4580]: I0321 06:04:02.694231 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567884-jmmwc" event={"ID":"50d11c0b-b5e7-4ddc-9617-f167e14aa812","Type":"ContainerStarted","Data":"67f031a1772103eb9c30f21f2551ce819c2a73763169573e48c0a159f8faab63"} Mar 21 06:04:02 crc kubenswrapper[4580]: I0321 06:04:02.715146 4580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567884-jmmwc" podStartSLOduration=1.703824574 podStartE2EDuration="2.715128814s" podCreationTimestamp="2026-03-21 06:04:00 +0000 
UTC" firstStartedPulling="2026-03-21 06:04:00.942980986 +0000 UTC m=+4346.025564614" lastFinishedPulling="2026-03-21 06:04:01.954285226 +0000 UTC m=+4347.036868854" observedRunningTime="2026-03-21 06:04:02.709349528 +0000 UTC m=+4347.791933166" watchObservedRunningTime="2026-03-21 06:04:02.715128814 +0000 UTC m=+4347.797712442" Mar 21 06:04:03 crc kubenswrapper[4580]: I0321 06:04:03.705096 4580 generic.go:334] "Generic (PLEG): container finished" podID="50d11c0b-b5e7-4ddc-9617-f167e14aa812" containerID="67f031a1772103eb9c30f21f2551ce819c2a73763169573e48c0a159f8faab63" exitCode=0 Mar 21 06:04:03 crc kubenswrapper[4580]: I0321 06:04:03.705212 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567884-jmmwc" event={"ID":"50d11c0b-b5e7-4ddc-9617-f167e14aa812","Type":"ContainerDied","Data":"67f031a1772103eb9c30f21f2551ce819c2a73763169573e48c0a159f8faab63"} Mar 21 06:04:05 crc kubenswrapper[4580]: I0321 06:04:05.026347 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567884-jmmwc" Mar 21 06:04:05 crc kubenswrapper[4580]: I0321 06:04:05.128091 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv77x\" (UniqueName: \"kubernetes.io/projected/50d11c0b-b5e7-4ddc-9617-f167e14aa812-kube-api-access-zv77x\") pod \"50d11c0b-b5e7-4ddc-9617-f167e14aa812\" (UID: \"50d11c0b-b5e7-4ddc-9617-f167e14aa812\") " Mar 21 06:04:05 crc kubenswrapper[4580]: I0321 06:04:05.134125 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50d11c0b-b5e7-4ddc-9617-f167e14aa812-kube-api-access-zv77x" (OuterVolumeSpecName: "kube-api-access-zv77x") pod "50d11c0b-b5e7-4ddc-9617-f167e14aa812" (UID: "50d11c0b-b5e7-4ddc-9617-f167e14aa812"). InnerVolumeSpecName "kube-api-access-zv77x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:04:05 crc kubenswrapper[4580]: I0321 06:04:05.231392 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv77x\" (UniqueName: \"kubernetes.io/projected/50d11c0b-b5e7-4ddc-9617-f167e14aa812-kube-api-access-zv77x\") on node \"crc\" DevicePath \"\"" Mar 21 06:04:05 crc kubenswrapper[4580]: I0321 06:04:05.731298 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567884-jmmwc" event={"ID":"50d11c0b-b5e7-4ddc-9617-f167e14aa812","Type":"ContainerDied","Data":"0612ec78d9b77487c3f119d7b13d5fb65744494ed8c43e9cc9bc440a8bab1c6d"} Mar 21 06:04:05 crc kubenswrapper[4580]: I0321 06:04:05.731353 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0612ec78d9b77487c3f119d7b13d5fb65744494ed8c43e9cc9bc440a8bab1c6d" Mar 21 06:04:05 crc kubenswrapper[4580]: I0321 06:04:05.731419 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567884-jmmwc" Mar 21 06:04:05 crc kubenswrapper[4580]: I0321 06:04:05.777210 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567878-tlz74"] Mar 21 06:04:05 crc kubenswrapper[4580]: I0321 06:04:05.787450 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567878-tlz74"] Mar 21 06:04:07 crc kubenswrapper[4580]: I0321 06:04:07.632605 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2dd27b1-8529-4275-9035-ac3c8622f707" path="/var/lib/kubelet/pods/a2dd27b1-8529-4275-9035-ac3c8622f707/volumes" Mar 21 06:04:16 crc kubenswrapper[4580]: I0321 06:04:16.812996 4580 scope.go:117] "RemoveContainer" containerID="2a78ff4ba5755faf0d13e2677de6ac88c3782d5a73b424adf91c4eabed5f0a03" Mar 21 06:05:15 crc kubenswrapper[4580]: I0321 06:05:15.948419 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 06:05:15 crc kubenswrapper[4580]: I0321 06:05:15.949209 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 06:05:45 crc kubenswrapper[4580]: I0321 06:05:45.947844 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 06:05:45 crc kubenswrapper[4580]: I0321 06:05:45.948390 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.140122 4580 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567886-kng75"] Mar 21 06:06:00 crc kubenswrapper[4580]: E0321 06:06:00.141236 4580 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d11c0b-b5e7-4ddc-9617-f167e14aa812" containerName="oc" Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.141253 4580 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d11c0b-b5e7-4ddc-9617-f167e14aa812" containerName="oc" Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.141454 4580 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="50d11c0b-b5e7-4ddc-9617-f167e14aa812" containerName="oc" Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.142138 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567886-kng75" Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.144219 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.144429 4580 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6fthc" Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.155874 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567886-kng75"] Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.160669 4580 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.166730 4580 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql4w7\" (UniqueName: \"kubernetes.io/projected/6fd7ae63-aef1-44fe-bd11-ec5c88d6780f-kube-api-access-ql4w7\") pod \"auto-csr-approver-29567886-kng75\" (UID: \"6fd7ae63-aef1-44fe-bd11-ec5c88d6780f\") " pod="openshift-infra/auto-csr-approver-29567886-kng75" Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.268079 4580 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql4w7\" (UniqueName: \"kubernetes.io/projected/6fd7ae63-aef1-44fe-bd11-ec5c88d6780f-kube-api-access-ql4w7\") pod \"auto-csr-approver-29567886-kng75\" (UID: \"6fd7ae63-aef1-44fe-bd11-ec5c88d6780f\") " pod="openshift-infra/auto-csr-approver-29567886-kng75" Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.288369 4580 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ql4w7\" (UniqueName: \"kubernetes.io/projected/6fd7ae63-aef1-44fe-bd11-ec5c88d6780f-kube-api-access-ql4w7\") pod \"auto-csr-approver-29567886-kng75\" (UID: \"6fd7ae63-aef1-44fe-bd11-ec5c88d6780f\") " pod="openshift-infra/auto-csr-approver-29567886-kng75" Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.461987 4580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567886-kng75" Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.943232 4580 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567886-kng75"] Mar 21 06:06:00 crc kubenswrapper[4580]: W0321 06:06:00.953249 4580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fd7ae63_aef1_44fe_bd11_ec5c88d6780f.slice/crio-4ed745a92a6524059fdc7e506c0e2897b3a3b56aaa6b448c8ba01cb471e81dd9 WatchSource:0}: Error finding container 4ed745a92a6524059fdc7e506c0e2897b3a3b56aaa6b448c8ba01cb471e81dd9: Status 404 returned error can't find the container with id 4ed745a92a6524059fdc7e506c0e2897b3a3b56aaa6b448c8ba01cb471e81dd9 Mar 21 06:06:00 crc kubenswrapper[4580]: I0321 06:06:00.978063 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567886-kng75" event={"ID":"6fd7ae63-aef1-44fe-bd11-ec5c88d6780f","Type":"ContainerStarted","Data":"4ed745a92a6524059fdc7e506c0e2897b3a3b56aaa6b448c8ba01cb471e81dd9"} Mar 21 06:06:02 crc kubenswrapper[4580]: I0321 06:06:02.995338 4580 generic.go:334] "Generic (PLEG): container finished" podID="6fd7ae63-aef1-44fe-bd11-ec5c88d6780f" containerID="106e505367e890bd1afc5944678143f1138bd7f970641057d34082b6e523862b" exitCode=0 Mar 21 06:06:02 crc kubenswrapper[4580]: I0321 06:06:02.995428 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567886-kng75" 
event={"ID":"6fd7ae63-aef1-44fe-bd11-ec5c88d6780f","Type":"ContainerDied","Data":"106e505367e890bd1afc5944678143f1138bd7f970641057d34082b6e523862b"} Mar 21 06:06:04 crc kubenswrapper[4580]: I0321 06:06:04.438306 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567886-kng75" Mar 21 06:06:04 crc kubenswrapper[4580]: I0321 06:06:04.478293 4580 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql4w7\" (UniqueName: \"kubernetes.io/projected/6fd7ae63-aef1-44fe-bd11-ec5c88d6780f-kube-api-access-ql4w7\") pod \"6fd7ae63-aef1-44fe-bd11-ec5c88d6780f\" (UID: \"6fd7ae63-aef1-44fe-bd11-ec5c88d6780f\") " Mar 21 06:06:04 crc kubenswrapper[4580]: I0321 06:06:04.494340 4580 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd7ae63-aef1-44fe-bd11-ec5c88d6780f-kube-api-access-ql4w7" (OuterVolumeSpecName: "kube-api-access-ql4w7") pod "6fd7ae63-aef1-44fe-bd11-ec5c88d6780f" (UID: "6fd7ae63-aef1-44fe-bd11-ec5c88d6780f"). InnerVolumeSpecName "kube-api-access-ql4w7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:06:04 crc kubenswrapper[4580]: I0321 06:06:04.580635 4580 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql4w7\" (UniqueName: \"kubernetes.io/projected/6fd7ae63-aef1-44fe-bd11-ec5c88d6780f-kube-api-access-ql4w7\") on node \"crc\" DevicePath \"\"" Mar 21 06:06:05 crc kubenswrapper[4580]: I0321 06:06:05.025131 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567886-kng75" event={"ID":"6fd7ae63-aef1-44fe-bd11-ec5c88d6780f","Type":"ContainerDied","Data":"4ed745a92a6524059fdc7e506c0e2897b3a3b56aaa6b448c8ba01cb471e81dd9"} Mar 21 06:06:05 crc kubenswrapper[4580]: I0321 06:06:05.025215 4580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ed745a92a6524059fdc7e506c0e2897b3a3b56aaa6b448c8ba01cb471e81dd9" Mar 21 06:06:05 crc kubenswrapper[4580]: I0321 06:06:05.025977 4580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567886-kng75" Mar 21 06:06:05 crc kubenswrapper[4580]: E0321 06:06:05.278076 4580 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fd7ae63_aef1_44fe_bd11_ec5c88d6780f.slice\": RecentStats: unable to find data in memory cache]" Mar 21 06:06:05 crc kubenswrapper[4580]: I0321 06:06:05.534441 4580 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567880-q9vcb"] Mar 21 06:06:05 crc kubenswrapper[4580]: I0321 06:06:05.544237 4580 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567880-q9vcb"] Mar 21 06:06:05 crc kubenswrapper[4580]: I0321 06:06:05.631941 4580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b24ee55-d60e-4fc2-bb1e-2d42e45b8453" 
path="/var/lib/kubelet/pods/5b24ee55-d60e-4fc2-bb1e-2d42e45b8453/volumes" Mar 21 06:06:15 crc kubenswrapper[4580]: I0321 06:06:15.947723 4580 patch_prober.go:28] interesting pod/machine-config-daemon-7w8lj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 06:06:15 crc kubenswrapper[4580]: I0321 06:06:15.948378 4580 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 06:06:15 crc kubenswrapper[4580]: I0321 06:06:15.948426 4580 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" Mar 21 06:06:15 crc kubenswrapper[4580]: I0321 06:06:15.949370 4580 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"379e8e024a57dd10d5bdedc9a7778f8224ee5c20fbf1b614b603068912694be3"} pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 06:06:15 crc kubenswrapper[4580]: I0321 06:06:15.949437 4580 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" podUID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerName="machine-config-daemon" containerID="cri-o://379e8e024a57dd10d5bdedc9a7778f8224ee5c20fbf1b614b603068912694be3" gracePeriod=600 Mar 21 06:06:16 crc kubenswrapper[4580]: I0321 06:06:16.135687 4580 generic.go:334] "Generic (PLEG): container finished" 
podID="a9668dcb-27e6-469d-aa01-da4dc9cf6664" containerID="379e8e024a57dd10d5bdedc9a7778f8224ee5c20fbf1b614b603068912694be3" exitCode=0 Mar 21 06:06:16 crc kubenswrapper[4580]: I0321 06:06:16.135734 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerDied","Data":"379e8e024a57dd10d5bdedc9a7778f8224ee5c20fbf1b614b603068912694be3"} Mar 21 06:06:16 crc kubenswrapper[4580]: I0321 06:06:16.135963 4580 scope.go:117] "RemoveContainer" containerID="e879f8dccf4bc258353e8d0b36e9014e0957a1695cfcb7085466c77be95c3b12" Mar 21 06:06:16 crc kubenswrapper[4580]: I0321 06:06:16.950765 4580 scope.go:117] "RemoveContainer" containerID="c499ab666c033a7376d6883491e99934b783d5bb344d9ca38e3935762c28405e" Mar 21 06:06:17 crc kubenswrapper[4580]: I0321 06:06:17.145470 4580 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7w8lj" event={"ID":"a9668dcb-27e6-469d-aa01-da4dc9cf6664","Type":"ContainerStarted","Data":"d41b95a291bd1ecd82a25ce1dfd4162a33368d55af65c2b9c658e4a4ae251692"}